adesso Blog

Tags:

  • Data management

Industries

18.11.2024 By Wolfgang Weber and Christian Blank

The transformation from a gas to a hydrogen network


The path to complete decarbonisation will also have a massive impact on gas network operators: parts of the infrastructure will probably no longer be needed in the long term, while other parts will have to be upgraded for operation with hydrogen. In this blog post, we outline a procedure for preparing the switch to hydrogen. The data required for this can also be used for other purposes.

Methodology

In an age when efficiency and rapid adaptability are crucial, process mining offers significant added value: it enables companies to optimise their business processes in a data-driven way. In this blog post, we explain in more detail how process mining can help in virtually any implementation, regardless of industry or process.

Methodology

In modern data processing, companies face the challenge of choosing the right database technology for their specific requirements. PostgreSQL and Databricks are two widely used solutions, each with its own strengths. In this blog post, I highlight the differences between PostgreSQL and Databricks, analyse their respective advantages and disadvantages, and present specific use cases that justify a switch to Databricks.

Methodology

06.06.2024 By Christian Del Monte

Change Data Capture for Data Lakehouse


Change Data Capture (CDC) is a technique that captures all data changes in a data store, collects them and prepares them for transfer and replication to other systems, either as a batch process or as a stream. This blog post focuses on the application of CDC in data lakehouses, using the example of Change Data Feed, a variant of CDC developed by Databricks for Delta Lake-based data lakehouses.
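Conceptually, a change data feed records every row-level change together with the table version at which it occurred, so downstream consumers can replicate the table incrementally. The following is a simplified, purely illustrative Python model of that idea (not the Databricks API; the class and field names are invented, loosely mirroring Delta Lake's `_change_type` and `_commit_version` columns):

```python
class ChangeFeedTable:
    """Toy model of a table with a change feed: every write appends
    change events tagged with a commit version. Illustrative only,
    not the Delta Lake / Databricks API."""

    def __init__(self):
        self.rows = {}      # current state, keyed by primary key
        self.feed = []      # append-only log of change events
        self.version = 0

    def upsert(self, key, row):
        self.version += 1
        if key in self.rows:
            # an update is recorded as pre-image + post-image
            self.feed.append({"key": key, "change_type": "update_preimage",
                              "row": self.rows[key], "version": self.version})
            self.feed.append({"key": key, "change_type": "update_postimage",
                              "row": row, "version": self.version})
        else:
            self.feed.append({"key": key, "change_type": "insert",
                              "row": row, "version": self.version})
        self.rows[key] = row

    def changes_since(self, version):
        """What a downstream consumer reads to catch up incrementally."""
        return [e for e in self.feed if e["version"] > version]

table = ChangeFeedTable()
table.upsert(1, {"name": "Alice"})
table.upsert(1, {"name": "Alicia"})
print([e["change_type"] for e in table.changes_since(1)])
# ['update_preimage', 'update_postimage']
```

A consumer that has processed everything up to version 1 only reads the two update events, instead of re-scanning the whole table.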

Methodology

In a distributed software system, data changes always pose a challenge. How is it possible to track the change history of data located in one part of the system in order to synchronise connected data stores in other subsystems? Change Data Capture (CDC) offers an answer to this question. I explain what this is all about in this blog post.
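The core idea can be sketched in a few lines of Python: compare two snapshots of a table and emit insert, update and delete events that other subsystems can replay. This is a hypothetical illustration only; production CDC tools usually read the database's transaction log rather than diffing snapshots:

```python
def capture_changes(old_snapshot, new_snapshot):
    """Diff two table snapshots (dicts keyed by primary key) and emit
    CDC events. Real CDC implementations read the transaction log
    instead, but the resulting event stream looks similar."""
    events = []
    for key, row in new_snapshot.items():
        if key not in old_snapshot:
            events.append({"op": "insert", "key": key, "after": row})
        elif old_snapshot[key] != row:
            events.append({"op": "update", "key": key,
                           "before": old_snapshot[key], "after": row})
    for key, row in old_snapshot.items():
        if key not in new_snapshot:
            events.append({"op": "delete", "key": key, "before": row})
    return events

old = {1: {"name": "Alice"}, 2: {"name": "Bob"}}
new = {1: {"name": "Alice"}, 2: {"name": "Bobby"}, 3: {"name": "Cara"}}
for event in capture_changes(old, new):
    print(event["op"], event["key"])
# update 2
# insert 3
```

Connected data stores can then apply exactly this event stream, in order, to stay synchronised with the source.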

Industries

Every day, employees struggle with manual reporting processes that cause high personnel costs and quality deficiencies and leave little room for process optimisation. Despite the crucial importance of KPIs for management, manual reporting processes are still widespread in production. In this blog post, I explain why companies in the IIoT sector should start with production reporting.

Methodology

Metadata-driven data pipelines are a game changer for data processing in companies. These pipelines use metadata to adapt processes dynamically instead of manually revising each step every time a data source changes. But just like the pipelines themselves, the metadata can become a bottleneck in the maintenance and further development of a pipeline framework. In this blog post, I use practical examples to show how the Jsonnet template language makes metadata easier to maintain.
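To give a flavour of the approach, here is a small, hypothetical Jsonnet fragment that generates one pipeline definition per source table from a single template, so adding a table means adding one list entry rather than copying a whole config block (the table names, fields and schedule are invented for illustration):

```jsonnet
// Hypothetical template: one function produces the full config
// for each source table, keeping the metadata in one place.
local pipeline(table) = {
  name: 'load_' + table,
  source: { system: 'erp', table: table },
  target: { schema: 'staging', table: table },
  schedule: 'daily',
};

{
  pipelines: [pipeline(t) for t in ['customers', 'orders', 'invoices']],
}
```

Evaluating this with the `jsonnet` CLI expands it into plain JSON with three fully spelled-out pipeline definitions.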

AI

Workflow orchestration and workflow engines are crucial components in modern data processing and software development, especially in the field of artificial intelligence (AI). These technologies make it possible to efficiently manage and coordinate various tasks and processes within complex data pipelines. In this blog post, we present Prefect, an intuitive tool for orchestrating workflows in AI development.
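To make the idea of workflow orchestration concrete, independent of Prefect's actual API, a minimal sketch of what an engine does under the hood might look like this: resolve task dependencies and run each task exactly once, after its upstream tasks (all names and the extract/transform/load example below are invented for illustration):

```python
# Minimal sketch of dependency-ordered task execution -- the core job
# of a workflow engine. This is an illustration, not Prefect's API.

def run_workflow(tasks, dependencies):
    """tasks: {name: callable taking the results dict};
    dependencies: {name: [upstream task names]}.
    Runs each task once, after all of its upstreams."""
    results, done = {}, set()

    def run(name):
        if name in done:
            return
        for upstream in dependencies.get(name, []):
            run(upstream)                      # ensure upstreams ran first
        results[name] = tasks[name](results)   # task sees upstream results
        done.add(name)

    for name in tasks:
        run(name)
    return results

tasks = {
    "extract": lambda r: [1, 2, 3],
    "transform": lambda r: [x * 10 for x in r["extract"]],
    "load": lambda r: sum(r["transform"]),
}
deps = {"transform": ["extract"], "load": ["transform"]}
print(run_workflow(tasks, deps)["load"])  # 60
```

A real engine such as Prefect adds what this sketch omits: scheduling, retries, parallelism, logging and observability for every task run.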

Methodology

In the ever-evolving world of data analytics and data management, Snowflake plays a prominent role in shaping the industry. This blog post looks at how Snowflake has developed and why it is considered a ground-breaking solution for businesses.
