All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
- sqlserver (#196)
- `datacontract export --format dbml`: Export to Database Markup Language (DBML) (#135)
- `datacontract export --format avro`: Now supports a config map on field level for logical types and default values (Custom Avro Properties)
- `datacontract import --format avro`: Now supports importing logicalType and default definitions from Avro files (Custom Avro Properties)
- `config.bigqueryType` for testing BigQuery types
- Import through the `glue-table` parameter (#122)
- `datacontract catalog`: Show the search bar also on mobile
- `datacontract catalog`: Search
- `datacontract publish`: Publish the data contract to the Data Mesh Manager
- `datacontract import --format bigquery`: Import from BigQuery format (#110)
- `datacontract export --format bigquery`: Export to BigQuery format (#111)
- `datacontract export --format avro`: Now supports Avro logical types to better model date types. `date`, `timestamp`/`timestamp-tz`, and `timestamp-ntz` are now mapped to the appropriate logical types. (#141)
- `datacontract import --format jsonschema`: Import from JSON Schema (#91)
- `datacontract export --format jsonschema`: Improved export by including more additional information
- `datacontract export --format html`: Added support for Service Levels, Definitions, Examples, and nested Fields
- `datacontract export --format go`: Export to Go types format
- azure (#146)
- Delta tables on S3 (#24)
- `datacontract catalog`: Generates a data contract catalog with an `index.html` file
- `datacontract export --format sql` for Databricks dialects
- `datacontract export --format great-expectations`
- `datacontract export --format sql`: `sql_type_converter` to build checks
- `datacontract test --publish-to-opentelemetry`
- `datacontract export --format protobuf`
- `datacontract export --format terraform` (limitation: only works for AWS S3 right now)
- `datacontract export --format sql`
- `datacontract export --format sql-query`
- `datacontract export --format avro-idl`: Generates an Avro IDL file containing records for each model
- `datacontract changelog datacontract1.yaml datacontract2.yaml` will now generate a changelog based on the changes in the data contract. This is useful for keeping track of changes in a data contract over time.
- `datacontract lint` will now check for a variety of possible errors in the data contract, such as missing descriptions, incorrect references to models or fields, nonsensical constraints, and more.
- `datacontract import --format avro` will now import Avro schemas into a data contract.
- `datacontract export --format avro`
- We now support testing Kafka messages, which is a huge step forward. We start with JSON and Avro messages; Protobuf will follow.
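The `datacontract changelog` and `datacontract lint` entries above can be exercised as follows. This is a usage sketch only: the file names are illustrative placeholders, and passing the contract file as a direct argument to `lint` is assumed from the CLI's argument style described elsewhere in this changelog.

```shell
# Generate a changelog between two versions of a data contract
# (file names are placeholders)
datacontract changelog datacontract1.yaml datacontract2.yaml

# Check a data contract for missing descriptions, incorrect
# references to models or fields, nonsensical constraints, etc.
datacontract lint datacontract.yaml
```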
- `datacontract import --format sql` (#51)
- `datacontract export --format dbt-sources`
- `datacontract export --format dbt-staging-sql`
- `datacontract export --format rdf` (#52)
- `datacontract breaking` to detect breaking changes between two data contracts.
- This is a breaking change (we are still on a 0.x.x version). The project migrated from Golang to Python. The Golang version can be found at cli-go.
- `test`: Support to directly run tests and connect to data sources defined in the `servers` section.
- `test`: Generated schema tests from the model definition.
- `test --publish URL`: Publish test results to a server URL.
- `export`: Now exports the data contract to the formats `jsonschema` and `sodacl`.
- `--file` option removed in favor of a direct argument: use `datacontract test datacontract.yaml` instead of `datacontract test --file datacontract.yaml`.
- `model` is now part of `export`.
- `quality` is now part of `export`.
- `diff` needs to be migrated to Python.
- `breaking` needs to be migrated to Python.
- `inline` needs to be migrated to Python.
- `model` command parameter: `type` -> `format`.
- `schema` command.
- `models` section for `diff`/`breaking`.
- `model` command.
- `inline`: Print to STDOUT instead of overwriting the data contract file.
- `quality`: Read input from STDIN if present.
- `test` command for Soda Core.
- `diff`/`breaking`.
- `schema` command: Prints your schema.
- `quality` command: Prints your quality definitions.
- `inline` command: Resolves all references using the `$ref: …` notation and writes them to your data contract.
- (`--file`, `--with`).
- `diff` command for dbt schema specification.
- `breaking` command for dbt schema specification.
- `init` when the file already exists.
- `validate` command renamed to `lint`.
- `check-compatibility` command.
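The migration from the `--file` option to a direct argument can be sketched as follows. The server URL is a placeholder, and the exact flag ordering is an assumption; only the commands themselves come from the notes above.

```shell
# Before the migration (Golang CLI): file passed via --file flag
# datacontract test --file datacontract.yaml

# After the migration (Python CLI): file passed as a direct argument
datacontract test datacontract.yaml

# Publish test results to a server URL (placeholder URL)
datacontract test --publish https://example.com/test-results datacontract.yaml
```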