Alternatives to DPR
Compare DPR alternatives for your business or organization using the curated list below. SourceForge ranks the best alternatives to DPR in 2026. Compare features, ratings, user reviews, pricing, and more from DPR competitors and alternatives in order to make an informed decision for your business.
-
1
dbt
dbt Labs
dbt helps data teams transform raw data into trusted, analysis-ready datasets faster. With dbt, data analysts and data engineers can collaborate on version-controlled SQL models, enforce testing and documentation standards, lean on detailed metadata to troubleshoot and optimize pipelines, and deploy transformations reliably at scale. Built on modern software engineering best practices, dbt brings transparency and governance to every step of the data transformation workflow. Thousands of companies, from startups to Fortune 500 enterprises, rely on dbt to improve data quality and trust as well as drive efficiencies and reduce costs as they deliver AI-ready data across their organization. Whether you’re scaling data operations or just getting started, dbt empowers your team to move from raw data to actionable analytics with confidence. -
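As a concrete illustration of the testing and documentation standards dbt enforces, a model's schema file can declare column-level tests such as `unique` and `not_null` that dbt runs against the warehouse (a minimal sketch; the model and column names here are hypothetical):

```yaml
# models/schema.yml : dbt compiles each declared test into a SQL query
# and fails the run if the query returns rows.
version: 2
models:
  - name: stg_orders
    description: "Staged order records"
    columns:
      - name: order_id
        description: "Primary key of the staged orders model"
        tests:
          - unique
          - not_null
```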
2
Trifacta
Trifacta
The fastest way to prep data and build data pipelines in the cloud. Trifacta provides visual and intelligent guidance to accelerate data preparation so you can get to insights faster. Poor data quality can sink any analytics project. Trifacta helps you understand your data so you can quickly and accurately clean it up. All the power with none of the code. Manual, repetitive data preparation processes don’t scale. Trifacta helps you build, deploy, and manage self-service data pipelines in minutes, not months. -
3
Tableau Prep
Salesforce
Tableau Prep changes the way traditional data prep is performed in an organization. By providing a visual and direct way to combine, shape, and clean data, Tableau Prep makes it easier for analysts and business users to start their analysis faster. Tableau Prep comprises two products: Tableau Prep Builder for building your data flows, and Tableau Prep Conductor for scheduling, monitoring, and managing flows across the organization. Three coordinated views let you see row-level data, profiles of each column, and your entire data preparation process. Pick which view to interact with based on the task at hand. If you want to edit a value, you select and directly edit. Change your join type, and see the result right away. With each action, you instantly see your data change, even on millions of rows of data. Tableau Prep Builder gives you the freedom to re-order steps and experiment without consequence.
Starting Price: $70 per user per month -
4
Zoho DataPrep
Zoho
Zoho DataPrep (Best ETL tool in 2025) is an AI-powered, advanced self-service data preparation software that helps organisations prepare large volumes of data. As a no-code ETL platform, it eliminates the need for complex coding, making data preparation accessible to users of all backgrounds. A standout feature is the ability to create entire ETL pipelines using Ask Zia; simply describe your data preparation needs in plain English, and our conversational AI will build the pipeline for you. Data can be imported from over 80 sources, and DataPrep can automatically identify errors, discover data patterns, and transform and enrich data without requiring coding. You can also set up automated export schedules to your preferred data destination. DataPrep also helps catalogue data and set up ETL pipelines to sync the prepared data to Zoho Analytics and data warehouses, among many other destinations.
Starting Price: $40 per month -
5
Verodat
Verodat
Verodat is a SaaS platform that gathers, prepares, enriches, and connects your business data to AI analytics tools, for outcomes you can trust. Verodat automates data cleansing and consolidates data into a clean, trustworthy data layer to feed downstream reporting. It manages data requests to suppliers, monitors the data workflow to identify bottlenecks and resolve issues, and generates an audit trail to evidence quality assurance for every data row. Customize validation and governance to suit your organization. Verodat reduces data prep time by 60%, allowing data analysts to focus on insights. The central KPI Dashboard reports key metrics on your data pipeline, allowing you to identify bottlenecks, resolve issues, and improve performance. The flexible rules engine allows users to easily create validation and testing to suit your organization's needs. With out-of-the-box connections to Snowflake, Azure, and other cloud systems, it's easy to integrate with your existing tools. -
6
Pantomath
Pantomath
Organizations continuously strive to be more data-driven, building dashboards, analytics, and data pipelines across the modern data stack. Unfortunately, most organizations struggle with data reliability issues, leading to poor business decisions and a lack of organizational trust in data, directly impacting their bottom line. Resolving complex data issues is a manual and time-consuming process involving multiple teams, all relying on tribal knowledge to manually reverse engineer complex data pipelines across different platforms to identify root causes and understand the impact. Pantomath is a data pipeline observability and traceability platform for automating data operations. It continuously monitors datasets and jobs across the enterprise data ecosystem, providing context to complex data pipelines by creating automated cross-platform technical pipeline lineage. -
7
DataKitchen
DataKitchen
Reclaim control of your data pipelines and deliver value instantly, without errors. The DataKitchen™ DataOps platform automates and coordinates all the people, tools, and environments in your entire data analytics organization – everything from orchestration, testing, and monitoring to development and deployment. You’ve already got the tools you need. Our platform automatically orchestrates your end-to-end multi-tool, multi-environment pipelines – from data access to value delivery. Catch embarrassing and costly errors before they reach the end-user by adding any number of automated tests at every node in your development and production pipelines. Spin up repeatable work environments in minutes to enable teams to make changes and experiment – without breaking production. Fearlessly deploy new features into production with the push of a button. Free your teams from tedious, manual work that impedes innovation. -
8
MassFeeds
Mass Analytics
MassFeeds is a specialized data preparation tool. It automatically and quickly prepares data that comes in multiple formats from various sources. It is designed to accelerate and facilitate the data prep process through the creation of automated data pipelines for your marketing mix model. Data is being created and collected at an increasing pace, and organizations cannot expect heavy manual data preparation processes to scale. MassFeeds helps clients prepare data collected from various sources in multiple formats using a seamless, automated, and easy-to-tweak process. Using MassFeeds’ pipeline of processors, data is structured into a standard format that can easily be ingested for modeling. Avoid manual data preparation, which is prone to human error. Make data processing accessible to a wider spectrum of users. Save more than 40% in processing time by automating repetitive tasks. -
9
AWS Data Pipeline
Amazon
AWS Data Pipeline is a web service that helps you reliably process and move data between different AWS compute and storage services, as well as on-premises data sources, at specified intervals. With AWS Data Pipeline, you can regularly access your data where it’s stored, transform and process it at scale, and efficiently transfer the results to AWS services such as Amazon S3, Amazon RDS, Amazon DynamoDB, and Amazon EMR. AWS Data Pipeline helps you easily create complex data processing workloads that are fault tolerant, repeatable, and highly available. You don’t have to worry about ensuring resource availability, managing inter-task dependencies, retrying transient failures or timeouts in individual tasks, or creating a failure notification system. AWS Data Pipeline also allows you to move and process data that was previously locked up in on-premises data silos.
Starting Price: $1 per month -
10
LearnQ.ai
LearnQ.ai
Start your journey with a personalized study plan to reach your dream SAT score with AI precision. A shorter yet deeper AI-driven test, coupled with an SAT score calculator, gauges students' understanding, setting the groundwork for Digital SAT prep. Engaging, mobile-friendly, topic-wise practice tests and AI-driven game-based learning modules make Digital SAT prep enjoyable and effective. Realistic Digital SAT practice tests mirror the College Board's format, ensuring students are exam-ready. Harnessing the power of AI, Mia offers personalized guidance, ensuring each Digital SAT student's prep journey is tailored to their unique needs. Our analytics platform equips educators with instant visibility into student progress, facilitating prompt and effective assistance where needed. Our advanced AI identifies and suggests areas for improvement, guiding teachers and admins toward data-driven teaching.
Starting Price: $39 one-time payment -
11
Google Cloud Dataflow
Google
Unified stream and batch data processing that's serverless, fast, and cost-effective. Fully managed data processing service. Automated provisioning and management of processing resources. Horizontal autoscaling of worker resources to maximize resource utilization. OSS community-driven innovation with Apache Beam SDK. Reliable and consistent exactly-once processing. Streaming data analytics with speed. Dataflow enables fast, simplified streaming data pipeline development with lower data latency. Allow teams to focus on programming instead of managing server clusters as Dataflow’s serverless approach removes operational overhead from data engineering workloads. Dataflow automates provisioning and management of processing resources to minimize latency and maximize utilization. -
12
Openbridge
Openbridge
Uncover insights to supercharge sales growth using code-free, fully-automated data pipelines to data lakes or cloud warehouses. A flexible, standards-based platform to unify sales and marketing data for automating insights and smarter growth. Say goodbye to messy, expensive manual data downloads. Always know what you’ll pay and only pay for what you use. Fuel your tools with quick access to analytics-ready data. As certified developers, we only work with secure, official APIs. Get started quickly with data pipelines from popular sources. Pre-built, pre-transformed, and ready-to-go data pipelines. Unlock data from Amazon Vendor Central, Amazon Seller Central, Instagram Stories, Facebook, Amazon Advertising, Google Ads, and many others. Code-free data ingestion and transformation processes allow teams to realize value from their data quickly and cost-effectively. Data is always securely stored directly in a trusted, customer-owned data destination like Databricks, Amazon Redshift, etc.
Starting Price: $149 per month -
13
GradesAI
GradesAI
GradesAI uses artificial intelligence to create personalized study plans and predictive practice tests that are based on your syllabus, class outline, historical data, and hundreds of other data points. We then feed all the data into our GSAI algorithm to create exam prep tailored to your learning style, goals, and feedback. GradesAI will generate your outline, class notes, predictive practice exams, flashcards, and all the exam prep you need to get the best grade possible. A friendly and supportive community of students and tutors can help you with any questions or challenges. Free access to all GradesAI tools and, for those who want to accelerate success, flexible subscription plans that fit your budget and needs. Monitor your progress with visual displays of your scores and performance on previous tests. Access your notes, flashcards, tests, essays, and your own uploaded material.
Starting Price: $19.99/month -
14
Dagster
Dagster Labs
Dagster is a next-generation orchestration platform for the development, production, and observation of data assets. Unlike other data orchestration solutions, Dagster provides you with an end-to-end development lifecycle. Dagster gives you control over your disparate data tools and empowers you to build, test, deploy, run, and iterate on your data pipelines. It makes you and your data teams more productive, your operations more robust, and puts you in complete control of your data processes as you scale. Dagster brings a declarative approach to the engineering of data pipelines. Your team defines the data assets required, quickly assessing their status and resolving any discrepancies. An assets-based model is clearer than a tasks-based one and becomes a unifying abstraction across the whole workflow.
Starting Price: $0 -
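The assets-based model described above can be sketched in plain Python (this is a conceptual illustration, not the Dagster API): each asset declares which upstream assets it is built from, and the orchestrator derives a valid materialization order from those declarations.

```python
# Illustrative sketch of a declarative asset graph: you state dependencies,
# and the run order is derived rather than hand-scripted.
from graphlib import TopologicalSorter

# Hypothetical assets, each mapped to the upstream assets it depends on.
assets = {
    "raw_orders": set(),
    "raw_customers": set(),
    "orders_cleaned": {"raw_orders"},
    "customer_orders": {"orders_cleaned", "raw_customers"},
}

def materialization_order(asset_graph):
    """Return one valid order in which to materialize the declared assets."""
    return list(TopologicalSorter(asset_graph).static_order())

order = materialization_order(assets)
print(order)
```

Because the graph is declarative, adding a new downstream asset only requires naming its inputs; the scheduling falls out automatically.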
15
CloverDX
CloverDX
Design, debug, run, and troubleshoot data transformations and jobflows in a developer-friendly visual designer. Orchestrate data workloads that require tasks to be carried out in the right sequence, and orchestrate multiple systems with the transparency of visual workflows. Deploy data workloads easily into a robust enterprise runtime environment, in the cloud or on-premises. Make data available to people, applications, and storage under a single unified platform. Manage your data workloads and related processes together in a single platform. No task is too complex. We’ve built CloverDX on years of experience with large enterprise projects. A developer-friendly open architecture and flexibility let you package and hide the complexity for non-technical users. Manage the entire lifecycle of a data pipeline, from design and deployment to evolution and testing. Get things done fast with the help of our in-house customer success teams.
Starting Price: $5000.00/one-time -
16
BenchPrep
BenchPrep
BenchPrep is a configurable cloud-based learning platform that delivers the best learning experience and drives revenue for nonprofits (credentialing bodies & associations), corporations, and training companies. With an award-winning learner-centric platform, BenchPrep increases learner engagement, improves long-term learner retention, and reduces dropout rates. BenchPrep Ascend increases operating revenue and reduces expenses by enabling learning organizations to support multiple business models and streamline the delivery of online courses. By creating a personalized experience that improves knowledge retention and drives better outcomes, BenchPrep Ascend amplifies the value of your learning program in a highly competitive market. -
17
RapidMiner
Altair
RapidMiner is reinventing enterprise AI so that anyone has the power to positively shape the future. We’re doing this by enabling ‘data loving’ people of all skill levels, across the enterprise, to rapidly create and operate AI solutions to drive immediate business impact. We offer an end-to-end platform that unifies data prep, machine learning, and model operations with a user experience that provides depth for data scientists and simplifies complex tasks for everyone else. Our Center of Excellence methodology and the RapidMiner Academy ensure customers are successful, no matter their experience or resource levels. Simplify operations, no matter how complex models are or how they were created. Deploy, evaluate, compare, monitor, manage, and swap any model. Solve your business issues faster with sharper insights and predictive models; no one understands the business problem like you do.
Starting Price: Free -
18
ConVista ConsPrep
Convista
Everyone is currently talking about SAP S/4HANA for Group Reporting, and for good reason. The new group accounting solution from SAP shows that modern and efficient group reporting is more than "just" consolidation. We have been pursuing this approach, in conjunction with the various consolidation solutions, for well over ten years with our SAP add-on ConVista ConsPrep. We start with the critical activities along the record-to-report process in order to allow barrier-free data flow and avoid manual intervention. Our approach: an integrated process from the recording of accounting data in the individual financial statements to the consolidated report. ConVista ConsPrep is an SAP-certified program package. The flexible software architecture combines the advantages of standard software with the possibility to implement customer-specific requirements. -
19
Talend Pipeline Designer
Talend
Talend Pipeline Designer is a web-based self-service application that takes raw data and makes it analytics-ready. Compose reusable pipelines to extract, improve, and transform data from almost any source, then pass it to your choice of data warehouse destinations, where it can serve as the basis for the dashboards that power your business insights. Build and deploy data pipelines in less time. Design and preview, in batch or streaming, directly in your web browser with an easy, visual UI. Scale with native support for the latest hybrid and multi-cloud technologies, and improve productivity with real-time development and debugging. Live preview lets you instantly and visually diagnose issues with your data. Make better decisions faster with dataset documentation, quality proofing, and promotion. Transform data and improve data quality with built-in functions applied across batch or streaming pipelines, turning data health into an effortless, automated discipline.
-
20
Inzata Analytics
Inzata Analytics
Inzata Analytics: an AI-powered, end-to-end data analytics software solution. Inzata takes your raw, unrefined data and transforms it into actionable insights, all on one platform. Build your entire data warehouse in less than one day using Inzata Analytics. Inzata’s library of over 700 data connectors ensures a seamless and fast data integration process. Our patented aggregation engine promises prepped, blended, and organized data models in seconds. Create automated data pipeline workflows for real-time data analysis updates in Inzata’s newest tool, InFlow. Finally, display your business data confidently on 100% customizable interactive dashboards. Realize the power of real-time analytics to supercharge your business agility and responsiveness with Inzata. -
21
QuerySurge
RTTS
QuerySurge leverages AI to automate the data validation and ETL testing of Big Data, Data Warehouses, Business Intelligence Reports and Enterprise Apps/ERPs with full DevOps functionality for continuous testing.
Use Cases:
- Data Warehouse & ETL Testing
- Hadoop & NoSQL Testing
- DevOps for Data / Continuous Testing
- Data Migration Testing
- BI Report Testing
- Enterprise App/ERP Testing
QuerySurge Features:
- Projects: multi-project support
- AI: automatically create data validation tests based on data mappings
- Smart Query Wizards: create tests visually, without writing SQL
- Data Quality at Speed: automate the launch, execution, and comparison, and see results quickly
- Test across 200+ platforms: Data Warehouses, Hadoop & NoSQL lakes, databases, flat files, XML, JSON, BI Reports
- DevOps for Data & Continuous Testing: RESTful API with 60+ calls & integration with all mainstream solutions
- Data Analytics & Data Intelligence: analytics dashboard & reports -
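The core of ETL testing tools like this is comparing a source dataset against its loaded target. A minimal stdlib sketch of that idea (not QuerySurge's implementation, which defines tests through its wizards and SQL) compares row sets keyed by a primary key:

```python
# Conceptual source-to-target validation: report rows the load dropped
# and rows whose values changed in transit.
def compare_datasets(source_rows, target_rows, key="id"):
    """Return (missing_keys, mismatched_keys) for target vs. source."""
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    missing = sorted(k for k in src if k not in tgt)
    mismatched = sorted(k for k in src if k in tgt and src[k] != tgt[k])
    return missing, mismatched

source = [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}, {"id": 3, "amount": 30}]
target = [{"id": 1, "amount": 10}, {"id": 2, "amount": 25}]
missing, mismatched = compare_datasets(source, target)
print(missing, mismatched)  # row 3 was dropped; row 2 was altered
```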
22
WinPure MDM
WinPure
WinPure™ MDM is a master data management solution that aligns with your business to achieve a single view of your data, with functions and features to help you manage it. The features are available à la carte from the Clean & Match Enterprise edition, repurposed specifically for simple web-based data prep and MDM operations. Handle data in dozens of different formats, with dozens of simple and powerful ways to clean, standardize, and transform it. Industry-leading data matching and error-tolerant technologies. Simple and configurable survivorship technology. General benefits include lower cost and faster time to market. Simple to use, with minimal training and minimal implementation. Better business outcomes, faster MDM or systems deployment. Faster and more accurate batch loads, simple and accessible data prep tools. Flexible and effective interconnectivity with other internal and external databases and systems via API. Faster time to synergies for M&A. -
23
Datazoom
Datazoom
Improving the experience, efficiency, and profitability of streaming video requires data. Datazoom enables video publishers to better operate distributed architectures through centralizing, standardizing, and integrating data in real-time to create a more powerful data pipeline and improve observability, adaptability, and optimization solutions. Datazoom is a video data platform that continually gathers data from endpoints, like a CDN or a video player, through an ecosystem of collectors. Once the data is gathered, it is normalized using standardized data definitions. This data is then sent through available connectors to analytics platforms like Google BigQuery, Google Analytics, and Splunk and can be visualized in tools such as Looker and Superset. Datazoom is your key to a more effective and efficient data pipeline. Get the data you need in real-time. Don’t wait for your data when you need to resolve an issue immediately. -
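Normalization against standardized data definitions, as described above, amounts to mapping each collector's source-specific field names onto one shared schema. A small sketch (the field names and mappings here are hypothetical, not Datazoom's actual data dictionary):

```python
# Illustrative event normalization: two video players emit differently
# shaped events; both are renamed to the standard definitions.
FIELD_MAP = {
    "player_a": {"vid": "video_id", "buf_ms": "buffer_duration_ms"},
    "player_b": {"videoId": "video_id", "bufferMs": "buffer_duration_ms"},
}

def normalize(event, source):
    """Rename source-specific fields to the standard data definitions."""
    mapping = FIELD_MAP[source]
    return {mapping.get(k, k): v for k, v in event.items()}

a = normalize({"vid": "x1", "buf_ms": 340}, "player_a")
b = normalize({"videoId": "x1", "bufferMs": 120}, "player_b")
print(a, b)  # both events now share the same field names
```

Once events share one schema, a single downstream connector (to BigQuery, Splunk, etc.) can serve every collector.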
24
definity
definity
Monitor and control everything your data pipelines do with zero code changes. Monitor data and pipelines in motion to proactively prevent downtime and quickly root-cause issues. Optimize pipeline runs and job performance to save costs and keep SLAs. Accelerate code deployments and platform upgrades while maintaining reliability and performance. Data and performance checks run in line with pipeline runs, and checks on input data run before pipelines even start, with automatic preemption of runs. definity takes away the effort to build deep end-to-end coverage, so you are protected at every step, across every dimension. definity shifts observability to post-production to achieve ubiquity, increase coverage, and reduce manual effort. definity agents automatically run with every pipeline, with zero footprint. Get a unified view of data, pipelines, infra, lineage, and code for every data asset. Detect at run-time and avoid async checks. Auto-preempt runs, even on inputs. -
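The pre-run input checks and automatic preemption described above can be sketched as follows (a conceptual illustration only; definity attaches this behavior agent-side with zero code changes, and the check names here are hypothetical):

```python
# Illustrative preemption: validate inputs first, and skip the run
# entirely if they fail, rather than failing mid-pipeline.
def check_input(rows, min_rows=1, required=("id",)):
    """Return a list of failure messages for the input dataset."""
    failures = []
    if len(rows) < min_rows:
        failures.append(f"expected at least {min_rows} rows, got {len(rows)}")
    for col in required:
        if any(col not in r or r[col] is None for r in rows):
            failures.append(f"column '{col}' has missing values")
    return failures

def run_pipeline(rows):
    failures = check_input(rows)
    if failures:
        # Preempt: the pipeline never runs on bad inputs.
        return {"status": "preempted", "failures": failures}
    return {"status": "ran", "rows_out": len(rows)}

result = run_pipeline([{"id": 1}, {"id": None}])
print(result)
```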
25
Kestra
Kestra
Kestra is an open-source, event-driven orchestrator that simplifies data operations and improves collaboration between engineers and business users. By bringing Infrastructure as Code best practices to data pipelines, Kestra allows you to build reliable workflows and manage them with confidence. Thanks to the declarative YAML interface for defining orchestration logic, everyone who benefits from analytics can participate in the data pipeline creation process. The UI automatically adjusts the YAML definition any time you make changes to a workflow from the UI or via an API call. Therefore, the orchestration logic is defined declaratively in code, even if some workflow components are modified in other ways. -
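A minimal flow in Kestra's declarative YAML interface might look like the sketch below (hedged: task type identifiers vary across Kestra versions, so treat the `type` value as an assumption):

```yaml
# A hypothetical Kestra flow: id, namespace, and an ordered list of tasks.
id: hello_pipeline
namespace: company.analytics
tasks:
  - id: say_hello
    type: io.kestra.core.tasks.log.Log
    message: "Pipeline started"
```

Because the whole flow is data, edits made in the UI or via the API round-trip back into this same YAML definition, as the description above notes.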
26
Stripe Data Pipeline
Stripe
Stripe Data Pipeline sends all your up-to-date Stripe data and reports to Snowflake or Amazon Redshift in a few clicks. Centralize your Stripe data with other business data to close your books faster and unlock richer business insights. Set up Stripe Data Pipeline in minutes and automatically receive your Stripe data and reports in your data warehouse on an ongoing basis–no code required. Create a single source of truth to speed up your financial close and access better insights. Identify your best-performing payment methods, analyze fraud by location, and more. Send your Stripe data directly to your data warehouse without involving a third-party extract, transform, and load (ETL) pipeline. Offload ongoing maintenance with a pipeline that’s built into Stripe. No matter how much data you have, your data is always complete and accurate. Automate data delivery at scale, minimize security risks, and avoid data outages and delays.
Starting Price: 3¢ per transaction -
27
Qlik Compose
Qlik
Qlik Compose for Data Warehouses provides a modern approach by automating and optimizing data warehouse creation and operation. Qlik Compose automates designing the warehouse, generating ETL code, and quickly applying updates, all whilst leveraging best practices and proven design patterns. Qlik Compose for Data Warehouses dramatically reduces the time, cost and risk of BI projects, whether on-premises or in the cloud. Qlik Compose for Data Lakes automates your data pipelines to create analytics-ready data sets. By automating data ingestion, schema creation, and continual updates, organizations realize faster time-to-value from their existing data lake investments. -
28
RudderStack
RudderStack
RudderStack is the smart customer data pipeline. Easily build pipelines connecting your whole customer data stack, then make them smarter by pulling analysis from your data warehouse to trigger enrichment and activation in customer tools for identity stitching and other advanced use cases. Start building smarter customer data pipelines today.
Starting Price: $750/month -
29
Integrate.io
Integrate.io
Unify Your Data Stack: Experience the first no-code data pipeline platform and power enlightened decision making. Integrate.io is the only complete set of data solutions & connectors for easy building and managing of clean, secure data pipelines. Increase your data team's output with all of the simple, powerful tools & connectors you’ll ever need in one no-code data integration platform. Empower any size team to consistently deliver projects on-time & under budget. We ensure your success by partnering with you to truly understand your needs & desired outcomes. Our only goal is to help you overachieve yours. Integrate.io's Platform includes:
- No-Code ETL & Reverse ETL: drag & drop no-code data pipelines with 220+ out-of-the-box data transformations
- Easy ELT & CDC: The Fastest Data Replication On The Market
- Automated API Generation: Build Automated, Secure APIs in Minutes
- Data Warehouse Monitoring: Finally Understand Your Warehouse Spend
- FREE Data Observability: Custom -
30
Unsupervised
Unsupervised
Unsupervised automates analytics so revenue teams go straight to why metrics are failing or succeeding, surfacing hidden growth opportunities without endless dashboards. Unsupervised automates the prep and analysis of complex data, discovering the most critical KPIs, and tracking ROI over time. Business and data teams use Unsupervised to drive growth, reduce costs, and mitigate risks without additional data science investment. This is why customers leveraging Unsupervised have already found more than $140 million in opportunities this year alone. -
31
IBM Guardium Data Compliance
IBM
Simplifies data regulation needs, enhances visibility, and streamlines monitoring. IBM® Guardium® Data Compliance helps organizations move through regulatory compliance and audit requirements more quickly and easily, safeguarding regulated data wherever it resides. Available in IBM® Guardium® Data Security Center, IBM Guardium Data Compliance can reduce audit prep time for data compliance regulations, provide continuous visibility of data security controls, and solve data compliance and data activity monitoring challenges.
-
32
DataOps.live
DataOps.live
DataOps.live, the Data Products company, delivers productivity and governance breakthroughs for data developers and teams through environment automation, pipeline orchestration, continuous testing, and unified observability. We bring agile DevOps automation and a powerful unified cloud Developer Experience (DX) to modern cloud data platforms like Snowflake. DataOps.live, a global cloud-native company, is used by Global 2000 enterprises including Roche Diagnostics and OneWeb to deliver thousands of Data Product releases per month with the speed and governance the business demands. -
33
Test-Guide.com
Test-Guide.com
All of our practice tests were crafted by experts from their respective fields. Our team has more than 100 years of combined experience in the education industry. Our free practice tests will help prepare you for your exam by asking realistic questions that may appear on your actual exam. Use our detailed answer explanations to learn from your mistakes. We offer practice tests, study materials, and prep course reviews for more than 100 different tests. We specialize in offering free practice tests. Our free test prep resources cover a wide variety of exams, allowing you to take practice tests for college admissions, grad school admissions, career, intelligence and personality, finance, nursing, driver ed, and more. We continually update our content to ensure our users have the most up-to-date information. We recommend prep courses and products based on in-depth research.
Starting Price: $24.99 one-time payment -
34
Data360 Analyze
Precisely
The most successful businesses have common denominators: maximizing organizational efficiencies, mitigating risk, growing revenue and innovating – fast. Data360 Analyze is the fastest way to aggregate and organize large amounts of data to uncover valuable insights across business units. Easily access, prep and analyze quality data through its intuitive browser-based architecture. A solid understanding of your organization’s data landscape can shed light on disparate data sources, missing and outlying values and anomalies in data logic. Accelerate the discovery, validation, transformation and blending of data from across your organization to deliver accurate, relevant and trusted information for analysis. Visual data inspection and lineage allow you to trace and access data at any step within the data flow analytic process to collaborate with other stakeholders and build confidence and trust in the data and insights. -
35
Wolters Kluwer Digital Tax Workflow
Wolters Kluwer
Firms that don’t fully commit to an integrated digital tax workflow solution can find themselves wasting time and leaving clients less than satisfied. It’s inefficient to have some steps that are automated and others that still require shuffling papers. If every part of your tax prep process doesn’t work together, technology can be working against you, not for you. Automated tax prep workflow software creates faster, trackable, end-to-end processes, which save your staff time while also satisfying busy clients. Every step, from obtaining client source documents to reviews and approvals, is faster, including the final step: you getting paid! In digital tax preparation, client data is digitally imported, reviews are handled on-screen, and returns are signed and filed electronically. By eliminating tedious data entry and other inefficient manual processes, tax workflow automation allows your firm to do more work with fewer resources. -
36
Azure Event Hubs
Microsoft
Event Hubs is a fully managed, real-time data ingestion service that’s simple, trusted, and scalable. Stream millions of events per second from any source to build dynamic data pipelines and immediately respond to business challenges. Keep processing data during emergencies using the geo-disaster recovery and geo-replication features. Integrate seamlessly with other Azure services to unlock valuable insights. Allow existing Apache Kafka clients and applications to talk to Event Hubs without any code changes—you get a managed Kafka experience without having to manage your own clusters. Experience real-time data ingestion and microbatching on the same stream. Focus on drawing insights from your data instead of managing infrastructure. Build real-time big data pipelines and respond to business challenges right away.
Starting Price: $0.03 per hour -
37
Informatica Data Engineering
Informatica
Ingest, prepare, and process data pipelines at scale for AI and analytics in the cloud. Informatica’s comprehensive data engineering portfolio provides everything you need to process and prepare big data engineering workloads to fuel AI and analytics: robust data integration, data quality, streaming, masking, and data preparation capabilities. Rapidly build intelligent data pipelines with CLAIRE®-powered automation, including automatic change data capture (CDC). Ingest thousands of databases, millions of files, and streaming events. Accelerate time to value and ROI with self-service access to trusted, high-quality data. Get unbiased, real-world insights on Informatica data engineering solutions from peers you trust. Reference architectures for sustainable data engineering solutions. AI-powered data engineering in the cloud delivers the trusted, high-quality data your analysts and data scientists need to transform the business. -
38
Catalog
Coalesce
Catalog from Coalesce (formerly CastorDoc) is a data catalog designed for mass adoption across the whole company. Get an overview of your entire data environment. Search for data instantly thanks to our powerful search engine. Onboard to a new data infrastructure and access data in a breeze. Go beyond the traditional data catalog. Modern data teams have numerous data sources; build one source of truth. With its delightful and automated documentation experience, Catalog makes it dead simple to trust data. Column-level, cross-system data lineage in minutes. Get a bird’s-eye view of your data pipelines to build trust in your data. Troubleshoot data issues, perform impact analyses, and comply with GDPR in one tool. Optimize performance, cost, compliance, and security for your data. Keep your data stack healthy with our automated infrastructure monitoring system.Starting Price: $699 per month -
39
Key Ward
Key Ward
Extract, transform, manage, and process CAD, FE, CFD, and test data effortlessly. Create automatic data pipelines for machine learning, ROM, and 3D deep learning. Remove data science barriers without coding. Key Ward's platform is the first end-to-end engineering no-code solution that redefines how engineers interact with their experimental and CAx data. By leveraging engineering data intelligence, our software enables engineers to easily handle their multi-source data, extract direct value with our built-in advanced analytics tools, and custom-build their machine and deep learning models, all on one platform, with a few clicks. Automatically centralize, update, extract, sort, clean, and prepare your multi-source data for analysis, machine learning, and/or deep learning. Use our advanced analytics tools on your experimental and simulation data to correlate, find dependencies, and identify patterns.Starting Price: €9,000 per year -
40
Interview Kickstart
Interview Kickstart
Customizable tech interview prep courses designed by 500+ instructors and mentors. Get trained and mentored by tech leads, hiring managers, and recruiters from tech companies. Sharpen your skills with technical and career coaching and 1:1 mentorship sessions with instructors. Live interview practice in real-life simulated environments with interviewers from tech companies. Constructive, structured, and actionable insights for improved interview performance. Company-, level-, and role-specific strategies based on real, proprietary data. Resume building, LinkedIn profile optimization, personal branding, and live behavioral workshops. Our program is designed, taught, and continuously refined by tech experts and top hiring managers. Upskill with the latest AI skills and nail your next tech interview. Guided interview prep to help you level up into top companies. All courses are developed and taught by experienced FAANG instructors.Starting Price: Free -
41
CData Sync
CData Software
CData Sync is a universal data pipeline that delivers automated continuous replication between hundreds of SaaS applications and cloud data sources and any major database or data warehouse, on-premises or in the cloud. Replicate data from hundreds of cloud data sources to popular database destinations, such as SQL Server, Redshift, S3, Snowflake, BigQuery, and more. Configuring replication is easy: log in, select the data tables to replicate, and select a replication interval. Done. CData Sync extracts data iteratively, causing minimal impact on operational systems by only querying and updating data that has been added or changed since the last update. CData Sync offers the utmost flexibility across full and partial replication scenarios and ensures that critical data is stored safely in your database of choice. Download a 30-day free trial of the Sync application or request more information at www.cdata.com/sync -
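The incremental behavior described above — querying only data added or changed since the last update — is the standard high-water-mark pattern. A generic sketch of that idea, not CData Sync's actual implementation (the row shape and `updated_at` column are hypothetical):

```python
# High-water-mark incremental replication: each run pulls only rows
# whose "updated_at" is newer than the last replicated timestamp.
# Row shape and column names here are hypothetical.
def incremental_pull(rows, last_sync):
    """Return rows changed after last_sync plus the new high-water mark."""
    changed = [r for r in rows if r["updated_at"] > last_sync]
    new_mark = max((r["updated_at"] for r in changed), default=last_sync)
    return changed, new_mark

source = [
    {"id": 1, "updated_at": 10},
    {"id": 2, "updated_at": 25},
    {"id": 3, "updated_at": 40},
]
changed, mark = incremental_pull(source, last_sync=20)
# changed contains ids 2 and 3; mark becomes 40
```

Persisting `mark` between runs is what keeps each subsequent extraction small relative to the full table.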
42
Spring Cloud Data Flow
Spring
Microservice-based streaming and batch data processing for Cloud Foundry and Kubernetes. Spring Cloud Data Flow provides tools to create complex topologies for streaming and batch data pipelines. The data pipelines consist of Spring Boot apps built using the Spring Cloud Stream or Spring Cloud Task microservice frameworks. Spring Cloud Data Flow supports a range of data processing use cases, from ETL to import/export, event streaming, and predictive analytics. The Spring Cloud Data Flow server uses Spring Cloud Deployer to deploy data pipelines made of Spring Cloud Stream or Spring Cloud Task applications onto modern platforms such as Cloud Foundry and Kubernetes. A selection of pre-built stream and task/batch starter apps for various data integration and processing scenarios facilitates learning and experimentation. Custom stream and task applications, targeting different middleware or data services, can be built using the familiar Spring Boot style programming model. -
43
PrepBytes
PrepBytes
PrepBytes is an initiative to help students with their placement preparation, targeting software development/engineering, analyst, and product-based roles in IT/internet/analytics companies. We are a team of graduates from IITs/NITs with work experience in fast-paced start-ups and top companies across the globe. We know what better placement preparation requires, where the gaps are, and how to set you up for success. PrepBytes students are mentored by industry experts who have excelled in their careers and are passionate about helping students achieve their dreams. Technical and aptitude tests are an important part of most placement processes. Crack your next placement with a series of PrepBytes practice and mock tests. Practice subject-wise and company-wise tests. Take real-time mock tests with other students and test your preparation. We devise a customized plan for you, keeping your aspirations in mind.Starting Price: $28.10 one-time payment -
44
Hevo
Hevo Data
Hevo Data is a no-code, bi-directional data pipeline platform built for modern ETL, ELT, and reverse ETL needs. It helps data teams streamline and automate org-wide data flows, saving ~10 hours of engineering time per week and enabling 10x faster reporting, analytics, and decision making. The platform supports 100+ ready-to-use integrations across databases, SaaS applications, cloud storage, SDKs, and streaming services. Over 500 data-driven companies spread across 35+ countries trust Hevo for their data integration needs. Try Hevo today and get your fully managed data pipelines up and running in just a few minutes.Starting Price: $249/month -
45
SAS Data Loader for Hadoop
SAS
Load your data into or out of Hadoop and data lakes. Prep it so it's ready for reports, visualizations, or advanced analytics – all inside the data lakes. And do it all yourself, quickly and easily. Makes it easy to access, transform, and manage data stored in Hadoop or data lakes with a web-based interface that reduces training requirements. Built from the ground up to manage big data on Hadoop or in data lakes, not repurposed from existing IT-focused tools. Lets you group multiple directives to run simultaneously or one after the other. Schedule and automate directives using the exposed public API. Enables you to share and secure directives. Call them from SAS Data Integration Studio, uniting technical and nontechnical user activities. Includes built-in directives – casing, gender and pattern analysis, field extraction, match-merge, and cluster-survive. Profiling runs in parallel on the Hadoop cluster for better performance.
-
46
PREP
PREP
PREP is a PDF remediation tool that helps teams make documents accessible quickly and at scale. It delivers up to 95% auto-tagging powered by AI, which can reduce manual effort by about 80%. A side-by-side screen reader preview allows real-world accessibility validation before files are released. Built to support compliance goals, PREP aligns with WCAG 2.2 AA, PDF/UA, ADA Title II, and Section 508. It fits into enterprise environments with REST API and LMS integration for workflow automation, and it follows security practices that include SOC 2 Type II, GDPR alignment, and encrypted data handling.Starting Price: $75 per month -
47
Observo AI
Observo AI
Observo AI is an AI-native data pipeline platform designed to address the challenges of managing vast amounts of telemetry data in security and DevOps operations. By leveraging machine learning and agentic AI, Observo AI automates data optimization, enabling enterprises to process AI-generated data more efficiently, securely, and cost-effectively. It reduces data processing costs by over 50% and accelerates incident response times by more than 40%. Observo AI's features include intelligent data deduplication and compression, real-time anomaly detection, and dynamic data routing to appropriate storage or analysis tools. It also enriches data streams with contextual information to enhance threat detection accuracy while minimizing false positives. Observo AI offers a searchable cloud data lake for efficient data storage and retrieval. -
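Deduplication of telemetry, one of the techniques listed above, can be illustrated generically. This is a textbook sketch, not Observo AI's actual algorithm: normalize each event, hash it, and drop repeats.

```python
import hashlib

# Generic telemetry deduplication sketch (not Observo AI's pipeline):
# normalize each event, hash the result, and keep only first occurrences.
def dedupe(events):
    seen, unique = set(), []
    for event in events:
        # Normalization (trim + lowercase) lets trivially different
        # duplicates hash to the same key.
        key = hashlib.sha256(event.strip().lower().encode()).hexdigest()
        if key not in seen:
            seen.add(key)
            unique.append(event)
    return unique

logs = ["ERROR timeout", "  error timeout ", "WARN retry"]
deduped = dedupe(logs)
# deduped keeps "ERROR timeout" and "WARN retry"
```

Real pipelines add smarter normalization and windowing, but the payoff is the same: fewer redundant events reaching downstream storage and analysis tools.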
48
K2View
K2View
At K2View, we believe that every enterprise should be able to leverage its data to become as disruptive and agile as the best companies in its industry. We make this possible through our patented Data Product Platform, which creates and manages a complete and compliant dataset for every business entity – on demand, and in real time. The dataset is always in sync with its underlying sources, adapts to changes in the source structures, and is instantly accessible to any authorized data consumer. Data Product Platform fuels many operational use cases, including customer 360, data masking and tokenization, test data management, data migration, legacy application modernization, data pipelining, and more – to deliver business outcomes in less than half the time and at half the cost of any alternative. The platform inherently supports modern data architectures – data mesh, data fabric, and data hub – and deploys in cloud, on-premises, or hybrid environments. -
49
Adele
Adastra
Adele is an intuitive platform designed to simplify the migration of data pipelines from any legacy system to a target platform. It empowers users with full control over the functional migration process, while its intelligent mapping capabilities offer valuable insights. By reverse-engineering data pipelines, Adele creates data lineage mappings and extracts metadata, enhancing visibility and understanding of data flows. -
50
Lightbend
Lightbend
Lightbend provides technology that enables developers to easily build data-centric applications that bring the most demanding, globally distributed applications and streaming data pipelines to life. Companies worldwide turn to Lightbend to solve the challenges of real-time, distributed data in support of their most business-critical initiatives. Akka Platform provides the building blocks that make it easy for businesses to build, deploy, and run large-scale applications that support digitally transformative initiatives. Accelerate time-to-value and reduce infrastructure and cloud costs with reactive microservices that take full advantage of the distributed nature of the cloud and are resilient to failure, highly efficient, and able to operate at any scale. Native support for encryption, data shredding, TLS enforcement, and continued compliance with GDPR. A framework for quick construction, deployment, and management of streaming data pipelines.