The programmable data streaming platform
Streamflow
streamflow.finance
sari added
I went to a ClickHouse meetup here in Dubai.
Never used the tool, but I'm happy to learn.
ClickHouse is:
- a database
- built for analytics
- open source
Furthermore, it:
- can be run on a single node or in a cluster
- stores data in columnar format
- uses both "vectorized query execution" and "runtime code generation" to maximize CPU usage
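The columnar point above is the key to ClickHouse's analytics speed. A toy sketch (plain Python, not ClickHouse itself) of why a columnar layout helps: an aggregate scans only the one column it needs instead of deserializing whole rows.

```python
# Row-oriented layout: each record stored together.
rows = [
    {"user": "a", "amount": 10, "country": "AE"},
    {"user": "b", "amount": 25, "country": "AE"},
    {"user": "c", "amount": 5,  "country": "DE"},
]

# Columnar layout: each field stored as its own contiguous array.
columns = {
    "user": ["a", "b", "c"],
    "amount": [10, 25, 5],
    "country": ["AE", "AE", "DE"],
}

# SUM(amount): the columnar scan touches one array and ignores the other
# fields entirely -- the cache-friendly access pattern that vectorized
# execution engines exploit.
total = sum(columns["amount"])
print(total)  # 40
```

The data here is made up for illustration; the layout contrast is the point.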
In the meetup, Cl...
Feed | LinkedIn
Rill is the fastest path from data lake to dashboard.
Unlike most BI tools, Rill comes with its own embedded in-memory database. Data and compute are co-located, and queries return in milliseconds.
So you can pivot, slice, and drill down into your data instantly.
Download Rill to start modeling data and create fast, exploratory dashboards.
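The "embedded in-memory database" idea can be sketched in a few lines. Rill actually embeds DuckDB; the stand-in below uses Python's built-in sqlite3 only because it ships with the standard library. The point is that data and compute live in the same process, so a dashboard-style slice-and-aggregate query never crosses the network.

```python
import sqlite3

# In-process, in-memory database: no network round-trip between the
# query engine and the data, which is the co-location property the
# embedded-database argument relies on.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (country TEXT, revenue REAL)")
con.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("AE", 120.0), ("AE", 80.0), ("DE", 50.0)],
)

# The kind of query a pivot or drill-down issues (table and column
# names are illustrative, not Rill's schema).
result = con.execute(
    "SELECT country, SUM(revenue) FROM events "
    "GROUP BY country ORDER BY country"
).fetchall()
print(result)  # [('AE', 200.0), ('DE', 50.0)]
```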
rilldata • GitHub - rilldata/rill: Rill is a tool for effortlessly transforming data sets into powerful, opinionated dashboards using SQL. BI-as-code.
Nicolay Gerold added
Data Transformation
Built For Growth
Don't hack custom scripts or use half-baked tools. SQLMesh ensures accurate and efficient data pipelines with the most complete DataOps solution for transformation, testing, and collaboration.
SQLMesh
Nicolay Gerold added
When we think about a world where everyone is connected, where everything is a node on the network, where every node can publish and subscribe, we can understand the need for the stream/filter/drain architecture.
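The stream/filter/drain architecture maps naturally onto generator pipelines: a source node publishes a stream, a filter node subscribes and republishes a transformed stream, and a drain consumes it. A minimal sketch (names are illustrative, not from the essay):

```python
def stream(items):
    # Source node: publishes a stream of items.
    yield from items

def only_even(nums):
    # Filter node: subscribes, keeps what it wants, republishes.
    for n in nums:
        if n % 2 == 0:
            yield n

def drain(nums):
    # Sink node: consumes the filtered stream.
    return list(nums)

result = drain(only_even(stream(range(10))))
print(result)  # [0, 2, 4, 6, 8]
```

Because each stage is lazy, filters compose without buffering the whole stream — every node on the network can be both a subscriber and a publisher.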
JP Rangaswami • Filters: Part 2: Thinking about the network as filter
sari added
FastStream brokers provide convenient function decorators @broker.subscriber and @broker.publisher to allow you to delegate the actual process of:
- consuming and producing data to Event queues, and
- decoding and encoding JSON-encoded messages
These decorators make it easy to specify the processing logic for your consumers and producers, allowing you to ...
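The decorator pattern can be illustrated without a real broker. The sketch below is a toy in-memory stand-in, not FastStream itself: it mimics the shape of `@broker.subscriber` / `@broker.publisher` and the JSON encode/decode step those decorators take off your hands (FastStream's real brokers talk to Kafka, RabbitMQ, etc.).

```python
import json
from collections import defaultdict

class ToyBroker:
    """In-memory stand-in for a message broker, for illustration only."""

    def __init__(self):
        self.handlers = defaultdict(list)

    def subscriber(self, topic):
        # Register the decorated function as a consumer of `topic`.
        def wrap(fn):
            self.handlers[topic].append(fn)
            return fn
        return wrap

    def publisher(self, topic):
        # Publish the decorated function's return value to `topic`.
        def wrap(fn):
            def inner(msg):
                result = fn(msg)
                if result is not None:
                    self.publish(topic, result)
                return result
            return inner
        return wrap

    def publish(self, topic, payload):
        raw = json.dumps(payload)       # encode to JSON "on the wire"
        for fn in self.handlers[topic]:
            fn(json.loads(raw))         # decode before the handler runs

broker = ToyBroker()
seen = []

@broker.subscriber("orders")
def on_order(msg):
    seen.append(msg["id"])

# A handler decorated as both subscriber and publisher: it consumes from
# "raw", validates, and its return value is republished to "orders".
@broker.subscriber("raw")
@broker.publisher("orders")
def validate(msg):
    return {"id": msg["id"], "ok": True}

broker.publish("raw", {"id": 1})
print(seen)  # [1]
```

Topic names and the validation logic are made up; the consume/transform/republish shape is the part that mirrors FastStream's decorators.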
airtai • GitHub - airtai/faststream: FastStream is a powerful and easy-to-use Python framework for building asynchronous services that interact with event streams such as Apache Kafka and RabbitMQ.
Nicolay Gerold added
Producing: One part of your system (a producer) wants to send a message, which could be anything like an order from a customer, a sensor reading, a log message, etc.
Consuming: Another part of your system (a consumer) wants to receive and process that message.
FastStream acts as a postman, ensuring the message is delivered accurately and efficiently from producer to consumer.
Applications:
E-commerce Platform: When a customer places an order, a message regarding that order needs to be communicated to various services like Order Management, Inventory Management, Payment Gateway, and more. FastStream will handle these messages, validate the order data, and ensure that all relevant microservices are informed in real-time, ensuring a smooth ordering process.
IoT Sensor Network: Consider a network of IoT devices sending sensor readings (temperature, humidity, etc.) to a central server. FastStream ensures that all these readings are validated, documented, and efficiently communicated to the central server for further analysis or action.
Financial Transactions: In a financial application, transactions and trading data can be passed through FastStream to validate transactions, communicate them to various analytic and logging microservices, and even trigger alerts or actions based on the data.
The last core data stack tool is the orchestrator. It models dependencies between tasks end-to-end in complex, heterogeneous cloud environments and integrates with the above-mentioned open data stack tools. Orchestrators are especially effective if you have some glue code that needs to run on a certain cadence, trigg...
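At its core, modeling dependencies between tasks means resolving a DAG and running each task only after its upstreams complete. A minimal sketch using the standard library's `graphlib` (task names are hypothetical, not from any particular orchestrator):

```python
from graphlib import TopologicalSorter

# A made-up pipeline: each key lists the tasks it depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# An orchestrator's scheduling core is essentially this resolution step,
# plus retries, triggers, and cadence on top.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'load', 'report']
```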
Data Engineering • The Open Data Stack Distilled into Four Core Tools
Nicolay Gerold added
Infrastructure to exchange resources with unknown people and businesses.
Seth Rosenberg • First Principles of Investing in FinTech
sari added