Alternatives to Apache Sentry
Compare Apache Sentry alternatives for your business or organization using the curated list below. SourceForge ranks the best alternatives to Apache Sentry in 2026. Compare features, ratings, user reviews, pricing, and more from Apache Sentry competitors and alternatives in order to make an informed decision for your business.
-
1
Apache Impala
Apache
Impala provides low latency and high concurrency for BI/analytic queries on the Hadoop ecosystem, including Iceberg, open data formats, and most cloud storage options. Impala also scales linearly, even in multitenant environments. Impala is integrated with native Hadoop security and Kerberos for authentication, and via the Ranger module, you can ensure that the right users and applications are authorized for the right data. Utilize the same file and data formats, metadata, security, and resource management frameworks as your Hadoop deployment, with no redundant infrastructure or data conversion/duplication. For Apache Hive users, Impala utilizes the same metadata and ODBC driver. Like Hive, Impala supports SQL, so you don't have to worry about reinventing the implementation wheel. With Impala, more users, whether using SQL queries or BI applications, can interact with more data through a single repository, with metadata stored from source through analysis. Starting Price: Free -
2
Apache Ranger
The Apache Software Foundation
Apache Ranger™ is a framework to enable, monitor, and manage comprehensive data security across the Hadoop platform. The vision with Ranger is to provide comprehensive security across the Apache Hadoop ecosystem. With the advent of Apache YARN, the Hadoop platform can now support a true data lake architecture. Enterprises can potentially run multiple workloads in a multi-tenant environment. Data security within Hadoop needs to evolve to support multiple use cases for data access, while also providing a framework for central administration of security policies and monitoring of user access. Ranger offers centralized security administration to manage all security-related tasks in a central UI or using REST APIs; fine-grained authorization for specific actions and/or operations on a Hadoop component/tool, managed through a central administration tool; a standardized authorization method across all Hadoop components; and enhanced support for different authorization methods, such as role-based access control. -
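The centralized, fine-grained authorization model described above can be sketched in a few lines: policies grant roles a set of actions on a resource, and one evaluator answers access questions for every component. This is a pure-Python illustration of the idea only; the policy fields, role names, and function are invented for the example and are not Ranger's actual API.

```python
# Illustrative sketch of Ranger-style centralized authorization.
# Policy and role data here are made up for the example.
POLICIES = [
    {"roles": {"analyst"}, "resource": "hive:sales.orders", "actions": {"select"}},
    {"roles": {"etl"}, "resource": "hdfs:/data/raw", "actions": {"read", "write"}},
]

USER_ROLES = {"alice": {"analyst"}, "bob": {"etl"}}

def is_authorized(user, resource, action):
    """Single central decision point, regardless of which Hadoop tool asks."""
    roles = USER_ROLES.get(user, set())
    return any(
        p["resource"] == resource and action in p["actions"] and roles & p["roles"]
        for p in POLICIES
    )

print(is_authorized("alice", "hive:sales.orders", "select"))  # True
print(is_authorized("alice", "hdfs:/data/raw", "write"))      # False
```

Because every component consults the same evaluator, administrators change one policy store instead of per-tool ACLs.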
3
Apache Hive
Apache Software Foundation
The Apache Hive data warehouse software facilitates reading, writing, and managing large datasets residing in distributed storage using SQL. Structure can be projected onto data already in storage. A command line tool and JDBC driver are provided to connect users to Hive. Apache Hive is an open source project run by volunteers at the Apache Software Foundation. Previously it was a subproject of Apache® Hadoop®, but has now graduated to become a top-level project of its own. We encourage you to learn about the project and contribute your expertise. Traditionally, SQL queries had to be implemented in the MapReduce Java API to execute over distributed data. Hive provides the necessary SQL abstraction to integrate SQL-like queries (HiveQL) into the underlying Java without the need to implement queries in the low-level Java API. -
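To see what that abstraction saves you, here is a `GROUP BY`/`COUNT(*)` query hand-written as explicit map, shuffle, and reduce phases, the way it would otherwise have to be expressed against a MapReduce-style API. A pure-Python illustration, not Hadoop code; the sample data is invented.

```python
# What HiveQL's "SELECT word, COUNT(*) ... GROUP BY word" hides:
# explicit map / shuffle / reduce phases over the data.
from itertools import groupby

rows = ["hadoop", "hive", "hadoop", "sql"]

# Map phase: emit (key, 1) pairs.
mapped = [(word, 1) for word in rows]

# Shuffle: sort by key so equal keys become adjacent.
shuffled = sorted(mapped)

# Reduce phase: sum the counts per key.
counts = {
    key: sum(v for _, v in group)
    for key, group in groupby(shuffled, key=lambda kv: kv[0])
}

print(counts)  # {'hadoop': 2, 'hive': 1, 'sql': 1}
```

In Hive the same result is one declarative statement; the engine plans and runs the phases for you.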
4
E-MapReduce
Alibaba
EMR is an all-in-one enterprise-ready big data platform that provides cluster, job, and data management services based on open-source ecosystems, such as Hadoop, Spark, Kafka, Flink, and Storm. Alibaba Cloud Elastic MapReduce (EMR) is a big data processing solution that runs on the Alibaba Cloud platform. EMR is built on Alibaba Cloud ECS instances and is based on open-source Apache Hadoop and Apache Spark. EMR allows you to use the Hadoop and Spark ecosystem components, such as Apache Hive, Apache Kafka, Flink, Druid, and TensorFlow, to analyze and process data. You can use EMR to process data stored on different Alibaba Cloud data storage services, such as Object Storage Service (OSS), Log Service (SLS), and Relational Database Service (RDS). You can quickly create clusters without the need to configure hardware and software. All maintenance operations are completed through its web interface. -
5
Apache Knox
Apache Software Foundation
The Knox API Gateway is designed as a reverse proxy, with consideration for pluggability in the areas of policy enforcement, through providers, and the backend services for which it proxies requests. Policy enforcement covers authentication/federation, authorization, auditing, dispatch, host mapping, and content rewrite rules. Policy is enforced through a chain of providers that are defined within the topology deployment descriptor for each Apache Hadoop cluster gated by Knox. The cluster definition is also defined within the topology deployment descriptor and provides the Knox Gateway with the layout of the cluster for purposes of routing and translation between user-facing URLs and cluster internals. Each Apache Hadoop cluster that is protected by Knox has its set of REST APIs represented by a single cluster-specific application context path. This allows the Knox Gateway to both protect multiple clusters and present the REST API consumer with a single endpoint. -
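The routing-and-translation idea, one gateway context path per cluster mapped to cluster-internal service addresses, can be sketched as a small lookup. The topology contents, hostnames, and `route` function below are invented for illustration and are not a real Knox descriptor or API.

```python
# Sketch of Knox-style URL translation: a user-facing gateway path
# like /gateway/<cluster>/<service>/... is rewritten to the matching
# cluster-internal service URL. All names here are hypothetical.
TOPOLOGIES = {
    "prod": {"WEBHDFS": "http://nn1.internal:50070/webhdfs"},
    "test": {"WEBHDFS": "http://nn-test.internal:50070/webhdfs"},
}

def route(gateway_url):
    # e.g. "/gateway/prod/webhdfs/v1/tmp" -> internal WebHDFS URL
    _, _, cluster, service, *rest = gateway_url.split("/")
    target = TOPOLOGIES[cluster][service.upper()]
    return target + "/" + "/".join(rest)

print(route("/gateway/prod/webhdfs/v1/tmp"))
```

One external endpoint can thus front several clusters, with the topology descriptor deciding where each request really goes.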
6
Apache Spark
Apache Software Foundation
Apache Spark™ is a unified analytics engine for large-scale data processing. Apache Spark achieves high performance for both batch and streaming data, using a state-of-the-art DAG scheduler, a query optimizer, and a physical execution engine. Spark offers over 80 high-level operators that make it easy to build parallel apps. And you can use it interactively from the Scala, Python, R, and SQL shells. Spark powers a stack of libraries including SQL and DataFrames, MLlib for machine learning, GraphX, and Spark Streaming. You can combine these libraries seamlessly in the same application. Spark runs on Hadoop, Apache Mesos, Kubernetes, standalone, or in the cloud. It can access diverse data sources. You can run Spark using its standalone cluster mode, on EC2, on Hadoop YARN, on Mesos, or on Kubernetes. Access data in HDFS, Alluxio, Apache Cassandra, Apache HBase, Apache Hive, and hundreds of other data sources. -
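The DAG scheduler mentioned above rests on one key idea: transformations are recorded lazily and only executed when an action forces evaluation. The toy class below illustrates that idea in pure Python; `MiniRDD` and its methods are invented for the example and are not the Spark API.

```python
# Miniature illustration of Spark's lazy-transformation model:
# map/filter only record stages; collect() runs the recorded DAG.
class MiniRDD:
    def __init__(self, data, ops=()):
        self.data, self.ops = data, ops          # ops = recorded stages

    def map(self, f):
        return MiniRDD(self.data, self.ops + (("map", f),))

    def filter(self, f):
        return MiniRDD(self.data, self.ops + (("filter", f),))

    def collect(self):                           # action: evaluate now
        out = self.data
        for kind, f in self.ops:
            out = [f(x) for x in out] if kind == "map" else [x for x in out if f(x)]
        return out

rdd = MiniRDD(range(6)).map(lambda x: x * x).filter(lambda x: x % 2 == 0)
print(rdd.collect())  # [0, 4, 16]
```

Deferring execution this way is what lets a real engine see the whole pipeline at once and optimize it before anything runs.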
7
Oracle Big Data SQL
Oracle
Oracle Big Data SQL Cloud Service enables organizations to immediately analyze data across Apache Hadoop, NoSQL, and Oracle Database, leveraging their existing SQL skills, security policies, and applications with extreme performance. From simplifying data science efforts to unlocking data lakes, Big Data SQL makes the benefits of Big Data available to the largest group of end users possible. Big Data SQL gives users a single location to catalog and secure data in Hadoop and NoSQL systems, and Oracle Database. It offers seamless metadata integration and queries that join data from Oracle Database with data from Hadoop and NoSQL databases. Utilities and conversion routines support automatic mappings from metadata stored in HCatalog (or the Hive Metastore) to Oracle tables. Enhanced access parameters give administrators the flexibility to control column mapping and data access behavior. Multiple cluster support enables one Oracle Database to query multiple Hadoop clusters and/or NoSQL systems.
-
8
Apache Phoenix
Apache Software Foundation
Apache Phoenix enables OLTP and operational analytics in Hadoop for low-latency applications by combining the best of both worlds: the power of standard SQL and JDBC APIs with full ACID transaction capabilities, and the flexibility of late-bound, schema-on-read capabilities from the NoSQL world, leveraging HBase as its backing store. Apache Phoenix is fully integrated with other Hadoop products such as Spark, Hive, Pig, Flume, and MapReduce. Its goal is to become the trusted data platform for OLTP and operational analytics for Hadoop through well-defined, industry-standard APIs. Apache Phoenix takes your SQL query, compiles it into a series of HBase scans, and orchestrates the running of those scans to produce regular JDBC result sets. Direct use of the HBase API, along with coprocessors and custom filters, results in performance on the order of milliseconds for small queries, or seconds for tens of millions of rows. Starting Price: Free -
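The "compiles SQL into HBase scans" step can be illustrated with a sorted key space: a predicate on the row key becomes a bounded range scan instead of a full-table read. This pure-Python sketch only shows the principle; the table layout and `scan` function are invented, not Phoenix internals.

```python
# Sketch of predicate-to-scan compilation: a row-key prefix condition
# becomes a bounded range scan over a sorted key space, HBase-style.
import bisect

table = sorted([("us|001", "a"), ("us|002", "b"), ("eu|001", "c")])

def scan(start, stop):
    """Return all (key, value) pairs with start <= key < stop."""
    lo = bisect.bisect_left(table, (start,))
    hi = bisect.bisect_left(table, (stop,))
    return table[lo:hi]

# WHERE rowkey LIKE 'us|%'  ->  scan from 'us|' up to 'us}'
# ('}' is the byte after '|', so the range covers every 'us|' prefix)
print(scan("us|", "us}"))  # [('us|001', 'a'), ('us|002', 'b')]
```

Touching only the keys in range, rather than every row, is where the milliseconds-for-small-queries behavior comes from.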
9
Hadoop
Apache Software Foundation
The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage. Rather than rely on hardware to deliver high availability, the library itself is designed to detect and handle failures at the application layer, thus delivering a highly available service on top of a cluster of computers, each of which may be prone to failures. A wide variety of companies and organizations use Hadoop for both research and production. Users are encouraged to add themselves to the Hadoop PoweredBy wiki page. Apache Hadoop 3.3.4 incorporates a number of significant enhancements over the previous major release line (hadoop-3.2). -
10
Apache Trafodion
Apache Software Foundation
Apache Trafodion is a webscale SQL-on-Hadoop solution enabling transactional or operational workloads on Apache Hadoop. Trafodion builds on the scalability, elasticity, and flexibility of Hadoop, and extends Hadoop to provide guaranteed transactional integrity, enabling new kinds of big data applications to run on Hadoop. It offers full-function ANSI SQL support; JDBC/ODBC connectivity for Linux/Windows clients; distributed ACID transaction protection across multiple statements, tables, and rows; performance improvements for OLTP workloads with compile-time and run-time optimizations; and support for large data sets using a parallel-aware query optimizer. Reuse existing SQL skills and improve developer productivity. Distributed ACID transactions guarantee data consistency across multiple rows and tables. Trafodion interoperates with existing tools and applications, is Hadoop and Linux distribution neutral, and is easy to add to your existing Hadoop infrastructure. Starting Price: Free -
11
Apache Mahout
Apache Software Foundation
Apache Mahout is a powerful, scalable, and versatile machine learning library designed for distributed data processing. It offers a comprehensive set of algorithms for various tasks, including classification, clustering, recommendation, and pattern mining. Built on top of the Apache Hadoop ecosystem, Mahout leverages MapReduce and Spark to enable data processing on large-scale datasets. At its core, Apache Mahout(TM) is a distributed linear algebra framework and mathematically expressive Scala DSL designed to let mathematicians, statisticians, and data scientists quickly implement their own algorithms. Apache Spark is the recommended out-of-the-box distributed back-end, and Mahout can be extended to other distributed backends. Matrix computations are a fundamental part of many scientific and engineering applications, including machine learning, computer vision, and data analysis, and Mahout is designed to handle them at scale by leveraging the power of Hadoop and Spark. -
12
Azure HDInsight
Microsoft
Run popular open-source frameworks—including Apache Hadoop, Spark, Hive, Kafka, and more—using Azure HDInsight, a customizable, enterprise-grade service for open-source analytics. Effortlessly process massive amounts of data and get all the benefits of the broad open-source project ecosystem with the global scale of Azure. Easily migrate your big data workloads and processing to the cloud. Open-source projects and clusters are easy to spin up quickly without the need to install hardware or manage infrastructure. Big data clusters reduce costs through autoscaling and pricing tiers that allow you to pay for only what you use. Enterprise-grade security and industry-leading compliance with more than 30 certifications helps protect your data. Optimized components for open-source technologies such as Hadoop and Spark keep you up to date. -
13
IBM Analytics Engine
IBM
IBM Analytics Engine provides an architecture for Hadoop clusters that decouples the compute and storage tiers. Instead of a permanent cluster formed of dual-purpose nodes, the Analytics Engine allows users to store data in an object storage layer such as IBM Cloud Object Storage and spins up clusters of compute nodes when needed. Separating compute from storage helps to transform the flexibility, scalability, and maintainability of big data analytics platforms. Build on an ODPi-compliant stack with pioneering data science tools and the broader Apache Hadoop and Apache Spark ecosystem. Define clusters based on your application's requirements. Choose the appropriate software pack, version, and size of the cluster. Use a cluster as long as required and delete it as soon as the application finishes its jobs. Configure clusters with third-party analytics libraries and packages. Deploy workloads from IBM Cloud services like machine learning. Starting Price: $0.014 per hour
-
14
ZetaAnalytics
Halliburton
The ZetaAnalytics product requires a compatible database appliance for its Data Warehouse. Landmark has qualified the ZetaAnalytics software using Teradata, EMC Greenplum, and IBM Netezza. Please see the ZetaAnalytics Release Notes for the most up-to-date qualified versions. Before installing and configuring the ZetaAnalytics software, ensure that the Data Warehouse you use for drilling data is created and running. Scripts to create the various Zeta-specific database components within the Data Warehouse will need to be run as part of the installation process; these require database administrator (DBA) rights. The ZetaAnalytics product requires Apache Hadoop for model scoring and real-time streaming. If you do not already have an Apache Hadoop cluster installed in your environment, please install it before running the ZetaAnalytics installer, which will prompt you for the name and port number of your Hadoop Name Server and Map Reducer. -
15
Apache Bigtop
Apache Software Foundation
Bigtop is an Apache Foundation project for infrastructure engineers and data scientists looking for comprehensive packaging, testing, and configuration of the leading open source big data components. Bigtop supports a wide range of components/projects, including, but not limited to, Hadoop, HBase, and Spark. Bigtop packages Hadoop RPMs and DEBs so that you can manage and maintain your Hadoop cluster. Bigtop provides an integrated smoke testing framework, alongside a suite of over 50 test files. Bigtop provides Vagrant recipes, raw images, and (work-in-progress) Docker recipes for deploying Hadoop from zero. Bigtop supports many operating systems, including Debian, Ubuntu, CentOS, Fedora, openSUSE, and many others. Bigtop includes tools and a framework for testing at various levels (packaging, platform, runtime, etc.) for both initial deployments as well as upgrade scenarios for the entire data platform, not just the individual components. -
16
Apache Accumulo
Apache Software Foundation
With Apache Accumulo, users can store and manage large data sets across a cluster. Accumulo uses Apache Hadoop's HDFS to store its data and Apache ZooKeeper for consensus. While many users interact directly with Accumulo, several open source projects use Accumulo as their underlying store. To learn more about Accumulo, take the Accumulo tour, read the user manual, and run the Accumulo example code. Feel free to contact us if you have any questions. Accumulo has a programming mechanism (called iterators) that can modify key/value pairs at various points in the data management process. Every Accumulo key/value pair has its own security label, which limits query results based on user authorizations. Accumulo runs on a cluster using one or more HDFS instances. Nodes can be added or removed as the amount of data stored in Accumulo changes. -
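The per-cell security label is the distinctive idea here: every key/value pair carries a visibility expression, and scans return only the cells the user's authorizations satisfy. The sketch below simplifies labels to `&`-joined terms; Accumulo's real visibility language also supports `|` and grouping, and the data and functions are invented for illustration.

```python
# Miniature Accumulo-style cell-level security: each cell carries a
# visibility label; a scan filters cells against the user's auths.
# Only '&' (AND) labels are modeled here; real labels are richer.
CELLS = [
    ("patient1", "name", "public", "Ada"),
    ("patient1", "diagnosis", "medical&staff", "flu"),
]

def visible(label, auths):
    return all(term in auths for term in label.split("&"))

def user_scan(auths):
    return [(row, col, val) for row, col, label, val in CELLS if visible(label, auths)]

print(user_scan({"public"}))                      # the name cell only
print(user_scan({"public", "medical", "staff"}))  # both cells
```

Filtering at the cell level means two users can scan the same table and see different data, with no per-table ACL juggling.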
17
Yandex Data Proc
Yandex
You select the size of the cluster, node capacity, and a set of services, and Yandex Data Proc automatically creates and configures Spark and Hadoop clusters and other components. Collaborate by using Zeppelin notebooks and other web apps via a UI proxy. You get full control of your cluster with root permissions for each VM. Install your own applications and libraries on running clusters without having to restart them. Yandex Data Proc uses instance groups to automatically increase or decrease computing resources of compute subclusters based on CPU usage indicators. Data Proc allows you to create managed Hive clusters, which can reduce the probability of failures and losses caused by metadata unavailability. Save time on building ETL pipelines and pipelines for training and developing models, as well as describing other iterative tasks. The Data Proc operator is already built into Apache Airflow. Starting Price: $0.19 per hour -
18
MLlib
Apache Software Foundation
Apache Spark's MLlib is a scalable machine learning library that integrates seamlessly with Spark's APIs, supporting Java, Scala, Python, and R. It offers a comprehensive suite of algorithms and utilities, including classification, regression, clustering, collaborative filtering, and tools for constructing machine learning pipelines. MLlib's high-quality algorithms leverage Spark's iterative computation capabilities, delivering performance up to 100 times faster than traditional MapReduce implementations. It is designed to operate across diverse environments, running on Hadoop, Apache Mesos, Kubernetes, standalone clusters, or in the cloud, and accessing various data sources such as HDFS, HBase, and local files. This flexibility makes MLlib a robust solution for scalable and efficient machine learning tasks within the Apache Spark ecosystem. -
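The pipeline utilities mentioned above follow a simple contract: each stage is fit on the data, transforms it, and hands the result to the next stage. Here is that contract in miniature, as illustrative pure Python; the `Scaler` and `Pipeline` classes are invented stand-ins, not the `pyspark.ml` API.

```python
# Miniature illustration of the ML-pipeline pattern MLlib is built on:
# stages expose fit/transform and are chained in order.
class Scaler:
    """Toy stage: scale values into [0, 1] by the observed maximum."""
    def fit(self, xs):
        self.max = max(xs) or 1        # guard against an all-zero column
        return self

    def transform(self, xs):
        return [x / self.max for x in xs]

class Pipeline:
    def __init__(self, stages):
        self.stages = stages

    def fit_transform(self, xs):
        for stage in self.stages:
            xs = stage.fit(xs).transform(xs)
        return xs

print(Pipeline([Scaler()]).fit_transform([2, 4, 8]))  # [0.25, 0.5, 1.0]
```

Real pipelines chain many such stages (feature extraction, scaling, a model) behind one fit/transform interface, which is what makes them easy to tune and reuse.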
19
AuthControl Sentry
Swivel Secure
Deployed in over 54 countries and implemented across enterprises including finance, government, healthcare, education, and manufacturing, AuthControl Sentry® provides organisations with true multi-factor authentication (MFA). It delivers an intelligent solution to prevent unauthorised access to applications and data. AuthControl Sentry® has the flexibility to support a range of architectural requirements and the ability to ensure maximum adoption, thanks to its variety of authentication factors. Patented PINsafe® technology for ultimate security. Supports on-premise and cloud for changeable architecture. A single tenancy and single-tiered cloud solution ensures optimised customization. Risk-based authentication and single sign-on as standard. Integrates seamlessly with hundreds of applications. Ensures maximum adoption with an extensive range of authenticators. -
20
Performance Sentry
Demand Technology Software
Performance Sentry was engineered from the ground up to monitor Windows Server performance and pinpoint application bottlenecks. Performance Sentry gathers mountains of Windows performance data across thousands of enterprise servers and reports only the most meaningful metrics. So you can solve performance problems before they impact your users. When you combine Performance Sentry’s intelligent data collection capabilities and easy administration tools, its full-function Microsoft SQL Server-based performance database, and comprehensive visibility and reporting solutions, you have the ability to take control of your critical Windows Servers and applications like never before. Scale performance monitoring effortlessly to hundreds or even thousands of machines. Performance Sentry is based on intelligent data collection agents deployed across every Windows Server in your environment. -
21
Deeplearning4j
Deeplearning4j
DL4J takes advantage of the latest distributed computing frameworks, including Apache Spark and Hadoop, to accelerate training. On multi-GPUs, it is equal to Caffe in performance. The libraries are completely open source, Apache 2.0, and maintained by the developer community and the Konduit team. Deeplearning4j is written in Java and is compatible with any JVM language, such as Scala, Clojure, or Kotlin. The underlying computations are written in C, C++, and CUDA. Keras serves as the Python API. Eclipse Deeplearning4j is the first commercial-grade, open-source, distributed deep-learning library written for Java and Scala. Integrated with Hadoop and Apache Spark, DL4J brings AI to business environments for use on distributed GPUs and CPUs. There are a lot of parameters to adjust when you're training a deep-learning network. We've done our best to explain them, so that Deeplearning4j can serve as a DIY tool for Java, Scala, Clojure, and Kotlin programmers. -
22
Apache Atlas
Apache Software Foundation
Atlas is a scalable and extensible set of core foundational governance services, enabling enterprises to effectively and efficiently meet their compliance requirements within Hadoop while integrating with the whole enterprise data ecosystem. Apache Atlas provides open metadata management and governance capabilities for organizations to build a catalog of their data assets, classify and govern these assets, and provide collaboration capabilities around these data assets for data scientists, analysts, and the data governance team. It ships with pre-defined types for various Hadoop and non-Hadoop metadata, plus the ability to define new types for the metadata to be managed. Types can have primitive attributes, complex attributes, and object references, and can inherit from other types. Instances of types, called entities, capture metadata object details and their relationships. REST APIs to work with types and instances allow easier integration. -
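The type/entity split described above is easy to picture: a type declares which attributes an entity may carry, and entity creation is validated against it. This pure-Python sketch is illustrative only; the type definition, attribute names, and `make_entity` helper are hypothetical, not Atlas's real type system or REST payloads.

```python
# Sketch of Atlas-style typed metadata: types declare attributes,
# entities are validated instances of a type. Names are invented.
TYPES = {"hive_table": {"attrs": {"name", "owner"}}}

def make_entity(type_name, **attrs):
    schema = TYPES[type_name]["attrs"]
    unknown = set(attrs) - schema
    if unknown:
        raise ValueError(f"unknown attributes for {type_name}: {unknown}")
    return {"typeName": type_name, "attributes": attrs}

entity = make_entity("hive_table", name="sales.orders", owner="etl")
print(entity["typeName"], entity["attributes"]["name"])
```

Keeping the schema in the type, rather than in each record, is what lets a catalog classify, search, and govern heterogeneous assets uniformly.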
23
DarkSentry
SentryBay
SentryBay can provide you with a range of services designed to provide real time threat intelligence and alerts to keep you ahead of cybersecurity risks. DarkSentry aggregates public, deep and dark web data across specific geographical locations to deliver localised, sector-targeted or individual enterprise-targeted information enabling vital cybersecurity decisions to be made. This includes the ability to point scanners to specific relevant data sources and filter results and combine credential and data scanning with SentryBay endpoint software to reinforce the use of remote access, corporate and SaaS applications. The DarkSentry service helps you to meet multiple compliance requirements including NIST, GDPR and PCI. -
24
Apache Kylin
Apache Software Foundation
Apache Kylin™ is an open source, distributed Analytical Data Warehouse for Big Data; it was designed to provide OLAP (Online Analytical Processing) capability in the big data era. By renovating the multi-dimensional cube and precalculation technology on Hadoop and Spark, Kylin is able to achieve near-constant query speed regardless of the ever-growing data volume. Reducing query latency from minutes to sub-second, Kylin brings online analytics back to big data. Kylin can analyze more than 10 billion rows in less than a second. No more waiting on reports for critical decisions. Kylin connects data on Hadoop to BI tools like Tableau, PowerBI/Excel, MSTR, QlikSense, Hue, and SuperSet, making BI on Hadoop faster than ever. As an Analytical Data Warehouse, Kylin offers ANSI SQL on Hadoop/Spark and supports most ANSI SQL query functions. Kylin can support thousands of interactive queries at the same time, thanks to the low resource consumption of each query. -
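The multi-dimensional cube and precalculation idea can be shown in miniature: aggregate every combination of dimensions once at build time, so a query becomes a cheap lookup instead of a scan. The data, dimensions, and cube layout below are invented for illustration; real Kylin cubes are far more sophisticated.

```python
# Miniature cube precalculation: precompute SUM(sales) for every
# combination of dimensions so queries are dictionary lookups.
from itertools import combinations

rows = [("2024", "EU", 10), ("2024", "US", 20), ("2025", "EU", 5)]
DIMS = ("year", "region")   # columns 0 and 1; column -1 is the measure

cube = {}
for size in range(len(DIMS) + 1):
    for dims in combinations(range(len(DIMS)), size):
        for row in rows:
            key = (dims, tuple(row[d] for d in dims))
            cube[key] = cube.get(key, 0) + row[-1]

# "SELECT SUM(sales) WHERE year = '2024'" becomes one lookup:
print(cube[((0,), ("2024",))])  # 30
# Grand total (no GROUP BY dimensions at all):
print(cube[((), ())])           # 35
```

The build cost is paid once; afterwards query latency no longer depends on how many raw rows exist, which is the near-constant-speed claim above.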
25
Apache Storm
Apache Software Foundation
Apache Storm is a free and open source distributed realtime computation system. Apache Storm makes it easy to reliably process unbounded streams of data, doing for realtime processing what Hadoop did for batch processing. Apache Storm is simple, can be used with any programming language, and is a lot of fun to use! Apache Storm has many use cases: realtime analytics, online machine learning, continuous computation, distributed RPC, ETL, and more. Apache Storm is fast: a benchmark clocked it at over a million tuples processed per second per node. It is scalable, fault-tolerant, guarantees your data will be processed, and is easy to set up and operate. Apache Storm integrates with the queueing and database technologies you already use. An Apache Storm topology consumes streams of data and processes those streams in arbitrarily complex ways, repartitioning the streams between each stage of the computation however needed. Read more in the tutorial. -
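A topology's shape, a source of tuples flowing through transforming stages, can be sketched with generators. This is an illustrative pure-Python analogy for the concept, not Storm's actual spout/bolt API, and the sample data is invented.

```python
# A Storm-style topology in miniature: a spout emits a stream of
# tuples; bolts transform the stream stage by stage.
def spout(sentences):
    """Source of the stream."""
    for sentence in sentences:
        yield sentence

def split_bolt(stream):
    """One tuple in, many tuples out."""
    for sentence in stream:
        yield from sentence.split()

def count_bolt(stream):
    """Stateful bolt: maintain running word counts."""
    counts = {}
    for word in stream:
        counts[word] = counts.get(word, 0) + 1
    return counts

result = count_bolt(split_bolt(spout(["storm is fast", "storm scales"])))
print(result)  # {'storm': 2, 'is': 1, 'fast': 1, 'scales': 1}
```

In Storm each stage would run as many parallel tasks across the cluster, with the framework repartitioning tuples between stages and guaranteeing they are processed.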
26
Sentri
Sentri
Sentri is a robust security platform, a blend of information, technology, and infrastructure. Looking for a product that's intuitive, smart, and applicable to users at all levels? Implementing an identity solution in an organization to thwart cyber-attacks involves shelling out for licensing, hardware, and resources. This is where SENTRI brings a cost-effective and efficient suite of access governance and control solutions. Sentri is a one-stop solution for all of your access governance needs, enabling organizations to manage their access rights while keeping their data secure, both in the cloud and on-premise. We are here to empower you with speedy responses, seamless self-service, and streamlined support, to your satisfaction. Sentri covers all your IAG (Identity Access Governance), IRM (Integrated Risk Management), and GRC (Governance Risk Compliance) requirements. -
27
SQL Sentry
SolarWinds
Stop wasting time trying to fix SQL Server performance problems. Are you constantly fighting database performance fires, looking in vain for the root cause of SQL Server slowdowns? Without the right information, you could waste valuable time looking in the wrong places for the answers to your performance problems. You need accurate, actionable, detailed metrics to quickly identify and address database problems. With SQL Sentry, you can effectively monitor, diagnose, and optimize your entire database environment. SQL Sentry helps you get out of fire-fighting mode so you can keep your databases running continuously at peak performance. SQL Sentry gives you the level of detail you need to find and fix SQL Server performance problems. SQL Sentry is the flagship product in the SentryOne monitoring solutions set, and was built by SQL Server experts to help you save time and frustration in troubleshooting database performance problems. Starting Price: $1,628 -
28
SentryFusion
Aculab
SentryFusion enables more secure, multi-factor analysis for access control to critical resources and areas. SentryFusion includes a cluster-based architecture that provides effective scalability, robustness, and future-proofing, along with the option of hosting on-premise or in a data center. Identify a user's voice and face during a video call so that they may later be recognized during a voice conversation, video call, or from an image. With identity theft on the rise, MFA is increasingly used to prevent unauthorized access to customer data or financial resources. The cluster-based architecture opens up the possibility of better-than-real-time operation, even in large-scale authentication scenarios. SentryFusion provides near-instantaneous results, streamlining the authentication process and removing unnecessary hassle for end users. -
29
FaiSentry
Aculab
FaiSentry includes a cluster-based architecture that provides effective scalability, robustness, and future-proofing, along with the option of hosting on-premise or in a data center. Going beyond passwordless login, FaiSentry enables efficient and frictionless identification of multiple individuals from a single image, with results returned within a fraction of a second. Our facial biometric engine combines enterprise-grade security and ease of use, creating the optimal business and client experience. In contrast to other face authentication products on the market, Aculab, combined with AI-driven technology, has created a solution impervious to race and gender biases. A single camera can monitor key entry and exit points, with FaiSentry identifying multiple individuals simultaneously from each image. -
30
Club Sentry
Club Sentry Software
Club Sentry is an on-premise member management software solution that caters to a variety of clubs, including spas, gyms, health clubs, and pool clubs. Powerful and easy to use, Club Sentry gives users the ability to take control of their facilities and manage their day-to-day operations seamlessly. Key features of Club Sentry include check-in/check-out, member profiling, facility access control, electronic payments, prospect tracking, email generation, scheduling, and reporting. Aside from this powerful set of features, Club Sentry offers three core modules (a point of sale module, a billing module, and a scheduling module) that help enhance club management. Starting Price: $295 one-time payment -
31
Sentry AI
Sentry AI
Double the productivity of your monitoring agents and guards with deep learning video analytics. No need to buy new, expensive cameras; Sentry AI works with most cameras on the market through an SMTP connection. It enables your camera with the latest AI capabilities: person/vehicle detection, facial recognition, license plate recognition, and more. Gather more insights with daily summaries and custom reports. Leveraging the latest AI technologies, specifically deep learning, Sentry AI reduces 99% of false alerts while minimizing missed events. Sentry AI is trained to work with security cameras in sub-optimal conditions. The core design of our product is based on security and safety use cases. Sentry AI optimizes performance down to the camera level by adjusting algorithms continuously, based on user feedback and self-learning. -
32
SentryLogin
Sentry Login
Since 2001 Sentry has been the #1 Member System for Squarespace, Weebly, WordPress and more. Easy to install paywall and password protection for Weebly, Squarespace, Yola, Blogger, WordPress and more. Sentry is easy: it was created with non-programmers in mind. All login form and protection code is provided so all you have to do is Copy, Paste, and Publish. The built-in Sentry Integration Wizard guides you in setting up your subscription plans and takes you through the installation, too. Sentry is easy, but if you ever get stuck, our friendly staff responds quickly to your email help requests... for life! No one has better or faster support. Using the Header / Footer (skin) tools, you can easily set the appearance of Sentry's forms and pages to look more like your site. Better yet, let us create your skin branding for you! The service is free. Starting Price: $4.95 per month -
33
Apache HBase
The Apache Software Foundation
Use Apache HBase™ when you need random, realtime read/write access to your Big Data. This project's goal is the hosting of very large tables -- billions of rows X millions of columns -- atop clusters of commodity hardware. Automatic failover support between RegionServers. Easy to use Java API for client access. Thrift gateway and a RESTful Web service that supports XML, Protobuf, and binary data encoding options. Support for exporting metrics via the Hadoop metrics subsystem to files or Ganglia; or via JMX. -
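HBase's data model boils down to a sorted map from (row, column) to value, supporting random reads/writes by key. The toy class below illustrates that model in pure Python; it is a conceptual sketch, not the HBase client API, and every name in it is invented.

```python
# Miniature HBase-style table: a sorted map keyed by (row, column)
# with random point reads and writes. Conceptual sketch only.
import bisect

class MiniTable:
    def __init__(self):
        self.keys, self.vals = [], []          # kept sorted by (row, col)

    def put(self, row, col, val):
        i = bisect.bisect_left(self.keys, (row, col))
        if i < len(self.keys) and self.keys[i] == (row, col):
            self.vals[i] = val                 # random write: overwrite cell
        else:
            self.keys.insert(i, (row, col))
            self.vals.insert(i, val)

    def get(self, row, col):
        """Random read: locate the cell by key, or return None."""
        i = bisect.bisect_left(self.keys, (row, col))
        if i < len(self.keys) and self.keys[i] == (row, col):
            return self.vals[i]
        return None

t = MiniTable()
t.put("row1", "cf:name", "Ada")
print(t.get("row1", "cf:name"))  # Ada
```

Keeping keys sorted is also what makes HBase's range scans cheap: contiguous rows sit next to each other on disk.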
34
Amazon EMR
Amazon
Amazon EMR is the industry-leading cloud big data platform for processing vast amounts of data using open-source tools such as Apache Spark, Apache Hive, Apache HBase, Apache Flink, Apache Hudi, and Presto. With EMR you can run Petabyte-scale analysis at less than half of the cost of traditional on-premises solutions and over 3x faster than standard Apache Spark. For short-running jobs, you can spin up and spin down clusters and pay per second for the instances used. For long-running workloads, you can create highly available clusters that automatically scale to meet demand. If you have existing on-premises deployments of open-source tools such as Apache Spark and Apache Hive, you can also run EMR clusters on AWS Outposts. Analyze data using open-source ML frameworks such as Apache Spark MLlib, TensorFlow, and Apache MXNet. Connect to Amazon SageMaker Studio for large-scale model training, analysis, and reporting. -
35
ThreatSentry
Privacyware
Don't sweat unaddressed vulnerabilities, insider misuse, or new types of attacks. ThreatSentry combines a state-of-the-art Web Application Firewall and port-level firewall with advanced behavioral filtering to block unwanted IIS traffic and web application threats. ThreatSentry delivers enterprise-grade, multi-layered protection and compliance (i.e. PCI DSS) for Microsoft IIS (5/6/7/8/10) at a small-business price! Implemented as a native module in IIS 7 through 10 (or as an ISAPI extension or filter in IIS 6 and IIS 5, respectively), with a snap-in to the Microsoft Management Console (MMC), ThreatSentry is exceptionally easy to use and designed to protect network weak points created by lapses in patch management, configuration errors, and the use of new and progressive attack techniques. Take advantage of a free ThreatSentry evaluation session today! We'll guide you one-on-one through installation and configuration. Click here to schedule. Starting Price: $649.00 -
36
Tencent Cloud Elastic MapReduce
Tencent
EMR enables you to scale the managed Hadoop clusters manually or automatically according to your business load curves or monitoring metrics. EMR's storage-computation separation even allows you to terminate a cluster to maximize resource efficiency. EMR supports hot failover for CBS-based nodes. It features a primary/secondary disaster recovery mechanism where the secondary node starts within seconds when the primary node fails, ensuring the high availability of big data services. The metadata of its components such as Hive supports remote disaster recovery. Storage-computation separation ensures high data persistence for COS data storage. EMR is equipped with a comprehensive monitoring system that helps you quickly identify and locate cluster exceptions to ensure stable cluster operations. VPCs provide a convenient network isolation method that facilitates your network policy planning for managed Hadoop clusters. -
37
Apache Giraph
Apache Software Foundation
Apache Giraph is an iterative graph processing system, built on top of Apache Hadoop for high scalability. For example, it is currently used at Facebook to analyze the social graph formed by users and their connections. Giraph originated as the open-source counterpart to Pregel, the graph processing architecture developed at Google and described in a 2010 paper. Both systems are inspired by the Bulk Synchronous Parallel model of distributed computation introduced by Leslie Valiant. Giraph adds several features beyond the basic Pregel model, including master computation, sharded aggregators, edge-oriented input, out-of-core computation, and more. With a steady development cycle and a growing community of users worldwide, Giraph is a natural choice for unleashing the potential of structured datasets at massive scale. -
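The Pregel/Bulk Synchronous Parallel model the paragraph describes can be sketched in a few lines: in each superstep, active vertices send messages to their neighbors, a barrier separates supersteps, and computation halts when no vertex remains active. The toy Python illustration below propagates the maximum value through a graph; it is a sketch of the model only, not Giraph's actual Java API, and the graph/value shapes are assumptions:

```python
def pregel_max(graph, values):
    """Toy Pregel-style max propagation.
    graph: vertex -> list of neighbor vertices (hypothetical shape);
    values: vertex -> initial integer value."""
    values = dict(values)
    active = set(graph)               # superstep 0: every vertex wakes up
    while active:
        # Compute phase: active vertices send their value to all neighbors.
        inbox = {v: [] for v in graph}
        for v in active:
            for neighbor in graph[v]:
                inbox[neighbor].append(values[v])
        # Barrier: messages are delivered at the next superstep. A vertex
        # stays active only if an incoming message raised its value.
        active = set()
        for v, messages in inbox.items():
            if messages and max(messages) > values[v]:
                values[v] = max(messages)
                active.add(v)
    return values
```

In Giraph the same loop runs distributed across Hadoop workers, with the barrier implemented as a global synchronization point between supersteps.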
38
Sentry
Sentry
From error tracking to performance monitoring, developers can see what actually matters, resolve issues quicker, and learn continuously about their applications, from the frontend to the backend. With Sentry's performance monitoring you can trace performance issues to poor-performing API calls and slow database queries. Source code, error filters, stack locals: Sentry enhances application performance monitoring with stack traces. Quickly identify performance issues before they become downtime. View the entire end-to-end distributed trace to see the exact, poor-performing API call and surface any related errors. Breadcrumbs make application development a little easier by showing you the trail of events that leads to the error(s).Starting Price: $26 per month -
39
Driver Sentry
TECHVISTA Co. Ltd.
Based on a comprehensive collection of millions of drivers, Driver Sentry offers a powerful and intelligent solution to diagnose and repair both software and hardware problems on computers. By analyzing the system's configuration and identifying any outdated, missing, or incompatible drivers, Driver Sentry can automatically recommend the most appropriate fixes. This includes updating drivers, resolving conflicts, and offering step-by-step guidance for troubleshooting and repair. What sets Driver Sentry apart is its efficiency. The system operates with minimal performance impact, ensuring that users experience little to no slowdown while it scans and repairs their systems. It is also optimized for low storage usage, making it a highly resource-efficient tool that can be run on systems with limited disk space. Whether you're a casual user or a tech professional, Driver Sentry offers an easy and efficient way to keep your computer running smoothly without using too many resources.Starting Price: $10.98 -
40
IPSentry
RGE
ipSentry is a Windows-based network monitoring software package used by thousands of information system specialists, system administrators, and IT solution providers around the world. When you purchase ipSentry network monitoring software, you are buying a powerful network administration tool which will continuously monitor your internet and intranet servers, routers, modems, databases, services, event logs, and more, 24 hours per day, ensuring that your network and devices are functioning properly. If a problem is detected, various alerts, notifications, and actions can be triggered to make sure you are aware of the problem as soon as possible. Like thousands of IT professionals around the world, use ipSentry to stay apprised of potential network issues and keep your network systems, servers, and other devices running smoothly. Download the fully functional 21-day evaluation of ipSentry Network Monitoring Suite.Starting Price: $199 one-time payment -
41
Apache Eagle
Apache Software Foundation
Apache Eagle (called Eagle in the following) is an open source analytics solution for instantly identifying security and performance issues on big data platforms, e.g. Apache Hadoop, Apache Spark, etc. It analyzes data activities, YARN applications, JMX metrics, and daemon logs, and provides a state-of-the-art alert engine to identify security breaches and performance issues and surface insights. Big data platforms normally generate huge amounts of operational logs and metrics in real time. Eagle was founded to solve the hard problems of securing and performance-tuning big data platforms by keeping metrics and logs always available and alerting immediately, even under huge traffic. Stream operational logs and data activities into the Eagle platform, including but not limited to audit logs, map/reduce jobs, YARN resource usage, JMX metrics, and various daemon logs. Generate alerts, show historical trends, and correlate alerts with raw data. -
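To illustrate the kind of rule an alert engine like Eagle's evaluates over a metric stream, here is a minimal threshold policy in Python. The event and policy dictionary shapes are assumptions made for this sketch, not Eagle's actual policy format or API:

```python
def evaluate_policy(events, policy):
    """Minimal threshold-style alert rule over a stream of metric events.
    Both the event and policy shapes are hypothetical, chosen only to
    illustrate the idea; they are not Eagle's policy format."""
    alerts = []
    for event in events:
        if (event.get("metric") == policy["metric"]
                and event.get("value", 0) > policy["threshold"]):
            alerts.append({
                "policy": policy["name"],
                "host": event.get("host"),
                "value": event["value"],
            })
    return alerts
```

A real engine adds windowing, correlation across streams, and stateful policies, but each policy still reduces to matching incoming events against a condition and emitting an alert record.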
42
Password Sentry
Password Sentry
Password Sentry (PS) is a website password protection enterprise software application that monitors logins to detect and block password sharing. PS employs cutting-edge technology to block dictionary and brute-force attacks: stop hackers from guessing passwords. Password Sentry is NOT an IP counter application. Password Sentry counts unique logins using geographical metrics. PS analyzes logins using PS::GeoTracking technology. Each user is geographically profiled. Their exact location is derived from their IP address: city, region, country, and coordinates (latitude and longitude). User logins are then mapped, and the distance between logins is analyzed for any given user. If a login is mapped outside the acceptable radius threshold (measured in miles, and defined via Control Panel Preferences), the user is suspended. This algorithm ensures that false positives and false negatives are negligible.Starting Price: $99.95 one-time payment -
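The distance check described above (comparing login locations against a radius threshold in miles) can be sketched with the haversine great-circle formula. The `outside_radius` helper below is hypothetical, written only to illustrate the idea; it is not Password Sentry's actual API:

```python
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_MILES = 3958.8

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (latitude, longitude) points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

def outside_radius(logins, radius_miles):
    """True if any pair of consecutive login locations is farther apart than
    the radius threshold (hypothetical helper, not Password Sentry's API)."""
    return any(haversine_miles(*a, *b) > radius_miles
               for a, b in zip(logins, logins[1:]))
```

For example, logins from New York and Los Angeles are roughly 2,400 miles apart, so with any typical radius setting the second login would trip the threshold.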
43
BigBI
BigBI
BigBI enables data specialists to build their own powerful big data pipelines interactively and efficiently, without any coding! BigBI unleashes the power of Apache Spark, enabling scalable processing of real big data (up to 100X faster); integration of traditional data (SQL, batch files) with modern data sources, including semi-structured (JSON, NoSQL DBs, Elastic, Hadoop) and unstructured (text, audio, video); and integration of streaming data, cloud data, AI/ML, and graphs. -
44
Gate Sentry
Gate Sentry
Gate Sentry is a modern visitor management system built for properties with on-site security, including gated communities, country clubs, and manufacturing sites. It replaces outdated tools like desktops, scanners, and paper logs with a single secure tablet, streamlining operations at the gate. Users can easily update guest lists and send secure VIP passes, with all information syncing instantly to the security tablet. Security teams can view real-time guest lists, scan mobile passes, and log entries with just a few taps, with no paperwork and no delays. Whether managing daily visitors, contractors, or event guests, Gate Sentry simplifies access control while boosting security and accountability across your property. -
45
CoverSentry
CoverSentry
CoverSentry is a privacy-focused platform that helps job seekers create stronger, more natural-sounding cover letters. Our goal is to ensure they won't be flagged as AI-written. Whether you're starting from scratch or improving a draft, CoverSentry offers intelligent tools with a simple interface to support your job application process. Its free analysis tool lets you upload or paste your letter and instantly see how robotic or AI-generated it sounds; trained on thousands of real applications, the model gives fast, visual results with no signup and no data storage. Privacy is built into the design: the analysis and cover letter check are processed locally, so your content stays private and isn't stored or shared, with no weird cookies and no tracking. Whether it's your first job or a major career move, CoverSentry helps all job seekers sound confident, human, and ready.Starting Price: $0 -
46
Sentry Wallet
Sentry Wallet
Sentry is an AI-protected multi-chain wallet designed to place an intelligent firewall in front of every transaction. It analyzes sends, swaps, and approvals in real time to detect phishing attempts, wallet drainers, and unsafe dApps before funds are compromised. The wallet is non-custodial, keeping private keys encrypted locally on the user’s device for maximum control and security. SentryAI provides clear risk scoring and plain-language explanations so users understand exactly what they are signing. Features like anomaly detection and an Emergency Panic Vault add additional protection when suspicious activity is detected. The wallet supports multiple networks, including EVM chains, Bitcoin, and Solana, with more integrations planned. By combining AI insights with privacy-first architecture, Sentry helps users transact confidently across blockchains.Starting Price: $10/month/user -
47
Work Sentry
Little Beak Private Limited
Work Sentry is a professional work tracking and management system designed to monitor and streamline employee productivity. It provides features such as time tracking, idle time monitoring, manual time entry, attendance management, project allocation, task tracking, screenshot capture, and detailed productivity reports. With an easy-to-use dashboard and real-time insights, Work Sentry helps businesses ensure accountability, improve efficiency, and manage remote or in-office teams effectively. -
48
SentryFile
CutCom Software
Sentry File allows you to integrate paper documents and electronic documents into an online filing system. It has all the tools that today's digital office demands, in a single, web-based package. Quickly create a complete digital library of all your important business documents. Easily integrate your paper documents by using any TWAIN, scan-to-email, scan-to-FTP, or scan-to-folder compatible scanning device. Upload electronic files such as Microsoft Office documents, audio, video, and virtually any other file format. Sentry File simplifies management with an ultra-intuitive graphical user interface. The Professional and Small Business Editions excel at meeting the needs of small and midsize businesses that want to protect valuable paper-based documentation at an affordable price. The highly scalable Sentry File Corporate and Enterprise Editions are ideal for large organizations that want a simple and effective way to distribute documentation across the office, or across the world. -
49
Oracle Big Data Service
Oracle
Oracle Big Data Service makes it easy for customers to deploy Hadoop clusters of all sizes, with VM shapes ranging from 1 OCPU to a dedicated bare metal environment. Customers choose between high-performance NVMe storage or cost-effective block storage, and can grow or shrink their clusters. Quickly create Hadoop-based data lakes to extend or complement customer data warehouses, and ensure that all data is both accessible and managed cost-effectively. Query, visualize, and transform data so data scientists can build machine learning models using the included notebook with its R, Python, and SQL support. Move customer-managed Hadoop clusters to a fully managed cloud-based service, reducing management costs and improving resource utilization.Starting Price: $0.1344 per hour -
50
Load your data into or out of Hadoop and data lakes. Prep it so it's ready for reports, visualizations or advanced analytics – all inside the data lakes. And do it all yourself, quickly and easily. Makes it easy to access, transform and manage data stored in Hadoop or data lakes with a web-based interface that reduces training requirements. Built from the ground up to manage big data on Hadoop or in data lakes; not repurposed from existing IT-focused tools. Lets you group multiple directives to run simultaneously or one after the other. Schedule and automate directives using the exposed public API. Enables you to share and secure directives. Call them from SAS Data Integration Studio, uniting technical and nontechnical user activities. Includes built-in directives – casing, gender and pattern analysis, field extraction, match-merge and cluster-survive. Profiling runs in parallel on the Hadoop cluster for better performance.