Hello, and welcome to this video. I am Nadeem Siddique, and today we're going to discuss one of our most interesting use cases: data model to data marketplace. In this video, we are going to see how a data set is created and then delivered to end users.
Our use case includes multiple personas. It starts with a request made by an end user to create a data set. This request is received by the data architects, who use erwin DM. The architects create the logical and physical models for the data set, and DDL is generated from those models and executed by the subject matter experts in the respective data sources.
The architects can use the Mart Administrator to push definitions, descriptions, sensitive data information, and user-defined properties from the data model to the Data Intelligence Suite. Technical data stewards then create source-to-target mappings and execute the code generation connectors. This pulls data from the sources and pushes it into the target, that is, the desired data set. This data set can now be made available in the erwin Marketplace and made accessible to end users upon request.
Now, let's look at all of this in our demonstration. The first thing we discussed was a request coming in from an end user. We can see here that the request for the creation of a new data set, customer classification, has come in from one of the end users. This request has also been assigned to various users, including technical data stewards and architects, who are users of either the Data Intelligence Suite or erwin Data Modeler. Throughout the process, there is constant communication between the end user and the users who have been assigned tasks.
As we discussed, once the request for the creation of a data set comes in, it's the task of an architect to go ahead, use DM, and create the logical and physical models. Here, we can see that we have both a logical and a physical model available for our customer classification data set.
Now, the structure for that data set is ready. And it's an out-of-the-box capability of erwin DM to generate the DDL that can be executed in the respective data source to create this structure.
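To make this concrete, here is a minimal sketch of the kind of DDL that forward engineering produces. The table and column names below are hypothetical, chosen only to illustrate a customer classification structure; the actual DDL is generated from the physical model in the target database's dialect.

```sql
-- Hypothetical DDL for illustration only; real DDL is generated
-- from the physical model for the chosen target database.
CREATE TABLE customer_classification (
    customer_id    INTEGER        NOT NULL,
    customer_name  VARCHAR(100)   NOT NULL,
    segment_code   CHAR(2),
    annual_spend   DECIMAL(12,2),
    classified_on  DATE,
    CONSTRAINT pk_customer_classification PRIMARY KEY (customer_id)
);
```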
As shown on the screen now, using DM to DI Connect, the architects can push all the definitions, descriptions, and user-defined properties from the models into the desired systems and environments within the Data Intelligence Suite.
And as you can see on the screen now, once the push from DM to DI is successful, you can view all the definitions, descriptions, logical names, and sensitive data indicators that were defined on your models, now available on the data set within the Data Intelligence Suite.
The next step is for the technical data stewards to go in and create source-to-target mappings, as you can see on the screen right now. You can view these mappings in a tabular format or in a graphical format. Here, our sources are mapped into the target, that is, the customer classification data set.
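As a rough idea of what such a mapping captures in its tabular form, here is a sketch with entirely hypothetical source systems, tables, and columns (the demo does not show the real ones):

```sql
-- Hypothetical source-to-target mapping, for illustration only.
--
--   Source column                      Target column (customer_classification)
--   -------------------------------    ----------------------------------------
--   crm.customer.cust_id           ->  customer_id
--   crm.customer.cust_name         ->  customer_name
--   billing.invoice.amount (SUM)   ->  annual_spend
```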
After the source-to-target mappings are done, it's a matter of running the forward engineering connector that is configured on your environments, and for the stewards, that is just a few clicks. As simple as that may sound, what this technology is doing is pulling data from the sources and pushing it into the target to provide you the actual data set. In effect, we are generating a data pipeline here.
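Conceptually, a pipeline built from the hypothetical mapping above boils down to an INSERT ... SELECT against the target table. The connector's actual generated code will differ, and the join, aggregation, and classification rule below are invented purely for illustration.

```sql
-- A conceptual sketch of the generated pipeline: pull from the mapped
-- sources, transform, and push into the target. The join, the aggregation,
-- and the segment rule are all hypothetical.
INSERT INTO customer_classification
    (customer_id, customer_name, segment_code, annual_spend, classified_on)
SELECT
    c.cust_id,
    c.cust_name,
    CASE WHEN SUM(i.amount) >= 100000 THEN 'A' ELSE 'B' END,  -- invented rule
    SUM(i.amount),
    CURRENT_DATE
FROM crm.customer AS c
JOIN billing.invoice AS i ON i.cust_id = c.cust_id
GROUP BY c.cust_id, c.cust_name;
```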
With all of this, lineage also gets created simultaneously, and we can look at that as well. This is the data set that was created and is now available for you.
Now that the data set is created, data stewards can work on making it available in the erwin Marketplace and continue to govern it. Thank you for watching this video. If you find this interesting, please reach out to us, and we can schedule a detailed demo.