
adesso BLOG


Tags:

  • Data management

Methodology

06.06.2024 By Christian Del Monte

Change Data Capture for Data Lakehouse


Change Data Capture (CDC) is a technique that captures all data changes in a data store, collects them and prepares them for transfer and replication to other systems, either as a batch process or as a stream. This blog post focuses on the application of CDC in data lakehouses, using the example of Change Data Feed, a variant of CDC developed by Databricks for Delta Lake-based data lakehouses.
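To give a concrete picture of what this looks like in practice, here is a minimal, hypothetical PySpark sketch of enabling and reading a Delta Lake Change Data Feed. The table name and starting version are placeholders, not taken from the post.

```python
# Minimal sketch: enabling and reading Delta Lake Change Data Feed with PySpark.
# Assumes a Spark environment with Delta Lake available; the table name
# "customers" and the version number are illustrative placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Enable the Change Data Feed on an existing Delta table.
spark.sql("""
    ALTER TABLE customers
    SET TBLPROPERTIES (delta.enableChangeDataFeed = true)
""")

# Read all row-level changes (inserts, updates, deletes) recorded since version 5.
changes = (
    spark.read.format("delta")
    .option("readChangeFeed", "true")
    .option("startingVersion", 5)
    .table("customers")
)

# Each row carries _change_type, _commit_version and _commit_timestamp columns
# that downstream systems can use to replicate the changes.
changes.show()
```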

Read more
Methodology

Metadata-driven data pipelines are a game changer for data processing in companies. These pipelines use metadata to dynamically update processes instead of manually revising each step every time a data source changes. As with the pipelines themselves, however, maintaining the metadata can become a bottleneck when a pipeline framework is maintained and developed further. In this blog post, I use practical examples to show how the Jsonnet template language makes it easier to maintain metadata.
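As a rough illustration of the idea (not code from the post itself), the sketch below evaluates a small Jsonnet template via the Python jsonnet bindings and lets the resulting metadata drive the pipeline steps. The source names and the load function are invented for the example.

```python
# Rough sketch of a metadata-driven pipeline step loop.
# Assumes the "jsonnet" Python bindings (module name _jsonnet) are installed;
# the source and target names below are purely illustrative.
import json
import _jsonnet

# Jsonnet template: one small function generates the metadata for many sources,
# so adding a source means adding one line instead of revising the pipeline.
METADATA_TEMPLATE = """
local source(name, schedule) = { name: name, schedule: schedule, target: "dwh_" + name };
[
  source("orders", "hourly"),
  source("customers", "daily"),
]
"""

# Evaluate the template once; the result is plain JSON the pipeline can consume.
metadata = json.loads(_jsonnet.evaluate_snippet("pipeline.jsonnet", METADATA_TEMPLATE))

def load(source_name: str, target: str) -> None:
    # Placeholder for the actual extract/load logic.
    print(f"loading {source_name} into {target}")

# The pipeline iterates over the metadata instead of hard-coding each step.
for entry in metadata:
    load(entry["name"], entry["target"])
```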

Read more
AI

Workflow orchestration and workflow engines are crucial components in modern data processing and software development, especially in the field of artificial intelligence (AI). These technologies make it possible to efficiently manage and coordinate various tasks and processes within complex data pipelines. In this blog post, we present Prefect, an intuitive tool for orchestrating workflows in AI development.
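To give a first impression of the tool, here is a minimal, hypothetical Prefect flow with two tasks; the task names and data are invented for illustration.

```python
# Minimal Prefect sketch: two tasks orchestrated by a flow.
# Assumes Prefect 2.x is installed; the data and task names are invented.
from prefect import flow, task

@task
def extract() -> list[int]:
    # Stand-in for fetching raw data, e.g. features for a model.
    return [1, 2, 3]

@task
def transform(values: list[int]) -> list[int]:
    # Stand-in for a preprocessing step.
    return [v * 2 for v in values]

@flow
def training_data_pipeline() -> list[int]:
    # Prefect records the state and logs of each task run in the flow.
    raw = extract()
    return transform(raw)

if __name__ == "__main__":
    print(training_data_pipeline())
```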

Read more
Methodology

In the ever-evolving world of data analytics and data management, Snowflake plays a prominent role in shaping the industry. This blog post looks at how Snowflake has developed and why it is considered a ground-breaking solution for businesses.

Read more
Methodology

In an increasingly digitalised world, the systematic collection, interpretation and use of data is becoming a key success factor. A company that fails to do this loses an essential tool for staying competitive and innovative in the age of digitalisation. In my blog post, I explain why data governance is important for companies and how adesso can support them in this area.

Read more
Industries

09.06.2023 By Juan Carlos Peñafiel Suárez

Which laboratory challenges can data standards solve?


In the era of digitalisation, the amount of available data has grown exponentially. Good communication between the systems involved therefore remains the key to managing it, and data standards play a major role in this. This blog post explains which challenges data standards can solve.

Read more
Industries

22.05.2023 By Christopher Krafft

A protocol stack for narrowband IoT (part 2)


While the first part of my blog series focused on the technology of NB-IoT, the second part will discuss the technical aspects of using it, as the advantages of transmitting data using narrowband IoT are quickly squandered if it is used incorrectly.

Read more
Methodology

Moving documents from a drive to SharePoint seems simple enough. Just copy and paste the documents and folders, and they are all available to you again a short time later. That means users can continue to work as they normally would, right? Not exactly, since metadata would have been the better option here. And I will explain why in my blog post.

Read more
Industries

Data is the basis for all business processes as well as a company’s value chain. Purchasing customer data records allows companies to make highly customised, targeted product offers, which increases conversion rates and, in turn, sales. This is exactly where digital regulation comes into play. In our blog post, we will explain what this looks like and how regulatory changes affect AI and data projects.

Read more
