New Features

DuckDB in External Runner

We added experimental support for DuckDB as a Loader and Exporter in the External Runner.

For loading, this yields better performance (around 25% faster) than the current approach. The Loader can be activated by setting the Hedda:ExperimentalDuckDbLoader configuration on the External Runner Container Config.

For exporting, DuckDB was also implemented as an option. Although the performance is currently worse (3x slower) than the current implementation, there is potential for future improvements. It can be activated with the Hedda:ExperimentalDuckDbExporter config.
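As an illustration, both experimental settings could be supplied together in the container configuration. The nesting shown here is an assumption based on the `Hedda:` key convention, and the boolean values are ours; consult the product documentation for the exact format:

```json
{
  "Hedda": {
    "ExperimentalDuckDbLoader": true,
    "ExperimentalDuckDbExporter": true
  }
}
```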

DuckDb Preview Reader

DuckDB was introduced as a new option for the Preview Reader configuration. Choosing DuckDB, which runs directly in the Frontend Service, helps reduce the overall server count by eliminating the need for Trino and Hive installations. With proper configuration, DuckDB can even outperform the Trino Reader. To utilize DuckDB, we suggest at least a P2 Plan for your App Service and mounting Premium Storage to store the Database File.


"Preview": {
  "ReaderType": "oh22.HEDDAIO.PreviewHandling.DuckDb.DuckDbReader",
  "Reader": {
    "AccountName": "{ADLS Account Name}",
    "Container": "{ADLS Container / FileSystem}",
    "DataSource": "{Database Storage Path; Default: 'duck.db'}",
    "TenantId": "{TenantId*}",
    "ClientId": "{ClientId*}",
    "ClientSecret": "{ClientSecret*}",
    "StorageType": "{StorageType*: adls|blob; Default: adls}"
  }
}

The Configuration of TenantId, ClientId and ClientSecret is optional on Azure Deployments.

Custom Service Principal

We have added the possibility to use a custom Service Principal when connecting to Azure Storage or MSSQL.
Simply select Custom Service Principal when choosing an Authentication Type for your Connection and enter the required credentials.

Lakehouse Connection

Microsoft Fabric Lakehouse has been added as a new Connection Type.

Use LUNA.UI as unified UI Framework

At oh22, we are now using our own unified UI Framework, LUNA.UI, to speed up and improve the development of new UI features in all our products, something all users will benefit from!


Improvements

  • Improved performance when publishing a Knowledge Base
  • Improved performance when loading Domains
  • Improved the Transport Schema Logic for Transports where loading the schema directly is costly, e.g. Databricks
  • Further DataType optimizations for optimal performance


Bug Fixes

  • Fixed a bug where a reference was lost for Data Link Mappings when a Domain Name changed
  • ContainerTrinoReader now correctly handles a previously unknown container revision state instead of showing an error
  • The Publish Knowledge Base window now indicates if changes were made to the metadata of the Knowledge Base. Metadata includes the name, description, data responsibility office and category of a Knowledge Base
  • Fixed a bug where a user working on a Knowledge Base was switched to another Knowledge Base that another user published
  • Fixed an issue where the Get Files button would not correctly react to form changes in the External Connection window

Package Updates



  • Improved Logging in the .NET Runner for a better debugging experience
  • Improved Queue Handling in the .NET Runner to accommodate stale Queues


  • Fixed an issue with NULL values in Lookup Key Domains
  • Fixed an issue in handling special Decimals created by the native Spark Parquet Uploader
  • Fixed an issue with Dates before year 100 in the INT96 date format

Breaking Changes

Dotnet Runner

The dotnet runner execution is now asynchronous. This changes how the library is called when used directly:


var runner = Hedda.Create(apiUrl, apiKey);
// Configure project, knowledge base, domain mappings, etc.
// Pass data to the client
// Execute this Run and await the result
var result = await runner.Start();

New Features

Business Rule name

The list of characters usable in Business Rule names has been changed to upper and lower case letters, numbers, spaces ( ), underscores (_) and dots (.).
This change will take effect for all Knowledge Bases to be published. Currently active Knowledge Bases won’t be affected by this change.
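As a quick sanity check, the allowed character set can be expressed as a regular expression. This is an illustrative sketch, not HEDDA.IO code; the function name and regex are ours:

```python
import re

# Allowed characters per the release note: upper and lower case letters,
# numbers, spaces, underscores and dots.
_ALLOWED_NAME = re.compile(r"[A-Za-z0-9 _.]+")

def is_valid_rule_name(name: str) -> bool:
    """Return True if the whole name consists of allowed characters only."""
    return bool(_ALLOWED_NAME.fullmatch(name))
```

For example, `is_valid_rule_name("Clean Address v2.1")` accepts, while a name containing a hyphen such as `"Price-Check"` is rejected.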

Preview Reader Retry Policy

We have added two optional settings to the Preview Reader configuration:
  • RetryDelay: Specifies the delay in seconds between attempts to retrieve the current Trino Server status. The default value is 5.
  • RetryAttempts: Specifies the maximum number of retry attempts for retrieving the current Trino Server status. The default value is 6.
Example configuration:

"Preview": {
  "ReaderType": "oh22.HEDDAIO.PreviewHandling.Trino.TrinoReader",
  "RunnerType": "oh22.HEDDAIO.PreviewHandling.Local.LocalExecutor",
  "Reader": {
    "User": "username",
    "Url": "https://example.trino.url",
    "Schema": "default",
    "DataLakeAccountName": "account-name",
    "DataLakeContainer": "container-name",
    "RetryDelay": 10,
    "RetryAttempts": 3
  }
}
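The interplay of the two settings can be sketched as a simple polling loop. This is our illustration of the described semantics, not the actual implementation; the function and status values are hypothetical:

```python
import time

def wait_for_server(get_status, retry_attempts: int = 6, retry_delay: int = 5) -> bool:
    """Poll get_status() until it reports 'ready', retrying up to
    retry_attempts times with retry_delay seconds between attempts
    (mirroring the defaults of 6 and 5 described above)."""
    for attempt in range(retry_attempts):
        if get_status() == "ready":
            return True
        if attempt < retry_attempts - 1:
            time.sleep(retry_delay)
    return False
```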

Preview Domain Comparison

Added a new Feature to the Preview Grid where a Domain can be selected for comparison with its Original Value.
By selecting the Domain via `Domain Selection > Select for Comparison` or via the new Header Menu, two new Columns are added which show the Original Value and the last applied Business Rule and Rulebook.

Preview Ordering

Added the possibility to order the Preview by Domains.


Improvements

  • Reduced the number of requests to Azure when using the ContainerTrinoReader configuration
  • Added the possibility to match Data Links to existing External Connections in a Project when importing a Knowledge Base
  • Changed the default of the IsReadOnly flag to True when importing Domains
  • Significantly improved the performance of the dotnet Runner
  • Improved the Preview Grid by adding a Header Menu to quickly change Domain display behavior
  • Improved the performance of loading Schema information from Databricks by parallelizing it


Bug Fixes

  • Resolved error toast messages appearing after deleting External Connections or Runs
  • Fixed an issue with the Container Preview Reader when setting very low ShutdownAfterMinutes values
  • Fixed an issue with the Preview Page showing no loading spinner
  • Fixed an issue where, after starting the Preview Reader, “Hive not accessible” would sometimes appear as an error message
  • Fixed an issue in Preview where Preparation Domains were not applied correctly
  • Fixed an issue in Preview where the Last Rule did not check for changes in Preparation
  • Fixed an issue where removing a Domain Mapping from a filtered list led to the wrong Mapping being removed

New Features

Direct Links

Case Insensitive Key Mapping

It is now possible to select whether the Key Mapping should be case sensitive or case insensitive. This allows for more flexibility in matching with your Master Data.

Exclusion of Columns

To mitigate heavy data load and memory pressure from large and wide tables, we have introduced the possibility to exclude columns that will not be used in conditions or formulas.

Analyse Preview Data

Extended Domain Filter

The Domain Filter option was extended.

  • Numbers

– Numbers can now be prefixed with >, <, >= and <= to filter accordingly, e.g. > 1.2 will filter for numbers greater than 1.2. If the prefix is omitted, the default operator = is used.

  • Dates

–  The format for Dates can now consist of only the date portion, e.g. 2024-01-01, which will filter all rows with the same date portion. Dates containing the time portion, e.g. 2024-01-01T00:00:00, will respect the time portion, so this example would only find dates which actually represent midnight on January 1st, 2024. Microseconds are ignored.

–  Similar to Numbers, Dates respect the prefixes >, <, >= and <=. E.g. < 2024-01-01 will filter for dates before January 1st, 2024.

–  The token $NOW was introduced to filter for dates in comparison to the current timestamp, e.g. >= $NOW.

–  The token $TODAY was introduced to filter for dates in comparison to the current date, e.g. <= $TODAY.

  • The NULL value has changed from <NULL> to $NULL.
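To make the prefix semantics concrete, the number filters above could be parsed roughly like this. This is an illustrative sketch of the described syntax, not the product's parser:

```python
import operator

# Longest prefixes first so ">=" is not mistaken for ">".
_OPS = {">=": operator.ge, "<=": operator.le, ">": operator.gt, "<": operator.lt}

def parse_number_filter(expr: str):
    """Return a predicate for a number filter such as '> 1.2' or '1.5'.
    Without a prefix the default operator '=' is applied."""
    expr = expr.strip()
    for prefix, op in _OPS.items():
        if expr.startswith(prefix):
            bound = float(expr[len(prefix):])
            return lambda value, op=op, bound=bound: op(value, bound)
    bound = float(expr)  # no prefix: default operator '='
    return lambda value, bound=bound: value == bound
```

For example, `parse_number_filter("> 1.2")` matches 1.3 but not 1.2.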



Bug Fixes

  • Fixed Import Knowledge Base from Databricks where Precision, Scale and Length were not set correctly
  • Fixed an issue in the Databricks All Purpose API where wide tables did not return all Data
  • Fixed an issue with Precision Validation in the Dotnet Runner in some cases
  • Fixed an issue in Forms where an empty description could be entered in the RichText Component
  • Minor UI fixes


Member Select in Business Rule

We introduced a new feature where you can select a Value from the Member List, analogous to referencing the Value of another Domain or Lookup. To achieve this, you need a Domain with Member Search enabled in the Domain Selection of the Condition. In the Value Field you can now enter the @ symbol, as you would to reference another Domain or Lookup. A new “Member” tab should be available; selecting it will load the List of Parent Members from which you can choose. Note: unlike a reference to a Lookup or Domain, this will not insert a reference to the Member Value but enters the selected value directly, meaning no connection or information remains that this value was chosen from the Member List.


Preparation

We added a new way to manipulate processed data, which opens up new possibilities in Business Rule processing. Initially started as Formulas, meant to help provide better data for Conditions, e.g. fixed casing or trimmed strings, it soon transformed into a new step in the processing of a Business Rule: the “Preparation”. This approach allows the data to be cleaned before it is used in a Condition. It can also be used to create “calculated” Domains which can be utilized directly on the left- and right-hand side of a condition, which also includes setting those Domains from Data Links, etc. So values from Data Links on the left side of a condition are now possible too.

The Following Formula Items are included:

For Strings
  • String: Provides a String value either directly or from a reference like a Lookup or Domain
  • Concat: Concatenates multiple Strings; an optional Separator can be entered
  • Lower: Transforms to lower case
  • Upper: Transforms to upper case
  • Regex Replace: Performs a regex transformation; Capture Groups can be accessed with $1, $2, … in the Replace Field
  • Replace: A regular String replace
  • SubString: Gets a part of a String from a zero-based Start value and an optional Length
  • Trim: Removes white space from a String, either on both sides or only from the front or end
  • String from Date: Creates a String from a Date with a specified Format
  • String from Number: Creates a String from a Number with a specified Format
  • String from Boolean: Creates a String from a Boolean where the representing Bool Values can be entered
For Numbers
  • Number: Provides a Number value either directly or from a reference like a Lookup or Domain
  • Addition: Adds multiple Numbers
  • Multiplication: Multiplies multiple Numbers
  • Subtraction: Subtracts multiple Numbers, starting from top to bottom
  • Division: Divides multiple Numbers, starting from top to bottom
  • Round: Rounds the Number to the desired decimal count. Rounding can be up, down or commercial
  • Number from String: Parses a Number from a String, with the possibility to provide thousands and decimal separators
  • Number from Date: Takes the seconds since Unix Epoch
For Dates
  • Date: Provides a Date value either directly or from a reference like a Lookup or Domain
  • Add Time: Adds Time to a Date
  • Date from Number: Creates a Date from a Number based on Unix Epoch seconds
  • Date from String: Parses a Date from a String based on the provided Format
For Boolean
  • Boolean: Provides a Boolean value either directly or from a reference like a Lookup or Domain
  • Boolean from String: Creates a Boolean from a String where the True and False Values can be provided
  • Boolean from Number: Creates a Boolean from a Number (0 or 1), otherwise null
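To illustrate how such Formula Items might be combined in a Preparation step, here is a sketch in plain Python. The function names are ours and do not reflect the HEDDA.IO API:

```python
def trim(s: str) -> str:
    """Remove white space from both sides (the 'Trim' item)."""
    return s.strip()

def upper(s: str) -> str:
    """Transform to upper case (the 'Upper' item)."""
    return s.upper()

def concat(*parts: str, separator: str = "") -> str:
    """Concatenate multiple Strings with an optional Separator (the 'Concat' item)."""
    return separator.join(parts)

# A Preparation could normalize a raw code before it is used in a Condition:
def prepare_country_code(raw: str) -> str:
    return upper(trim(raw))
```

Chaining items this way yields a cleaned, "calculated" value that a Condition can compare directly.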

‘Always True’ logical Condition

An ‘Always True’ logical Condition is now available. This new Condition will always be met successfully regardless of nested Conditions and can be used to e.g. skip the evaluation to immediately proceed with the Action of a Business Rule. Conditions within the ‘Always True’ Condition remain editable.

Databricks SQL Warehouse API Support

Added Databricks SQL Warehouse support to HEDDA.IO, which brings performance improvements over the Databricks Command API and should drastically reduce load times for Lookup and Preview operations that rely on Databricks.


Improvements

  • Improved performance when loading statistics
  • Improved Value Field to not show Lookup Tab if no Lookups are present
  • Added multiple receivers to Twilio alert sink
  • Removed profiling from python example code temporarily
  • The Data Type Validation option will now be selected by default in the Runs Execution Widget under the Domains tab
  • Improved DotnetRunner Performance
  • Extended Business Rule Node in Rulebook Overview. Condition will now be partially shown directly in the node and can be viewed in full on click without leaving the page.
  • Hardened publishing to exclude generated items through the API within valid references 
  • Improved default templates for events
  • Drastically improved load times when initially loading a Project in the UI
  • Added Collapsible Sidebar
  • Added Live / Edit Switch to Global Search
  • Added Search to Mapping Detail and Form


Variable Domains

Domains can now be marked as Variables. Variable Domains can still be used in Business Rules and Actions but will not appear in Mappings, as they are not read from the Source Data. This means they no longer need to be present, e.g., in the source data frame. On top of this, Variable Domains can also have a Default Value configured. On the Preview Screen, Variable Domains are hidden by default but can be made visible through the Domain Selection.

Last Applied Changes in Preview

The Preview Row Result View will now display the Last Rule which made a change to a specific Domain. Every Domain which has changed during the execution will display an Info Icon on hover. Clicking on it will display the Last Rule and Rulebook.

Databricks Transport

Support for Databricks as a Transport has been implemented. It is now possible to utilize Databricks as a Lookup or External Member Source. Furthermore, Databricks can be used to create Knowledge Bases. This Transport uses the General Cluster; a version with SQL Warehouse Cluster support is also planned for the future.

Support Page

The Support Page now gives you the possibility to submit an email ticket to our customer support. Enter a subject and click the corresponding button to open your email client.