
Deploy large-scale data projects with API Automation


To ensure that your data portal is robust and to guarantee the smooth management of large projects, Opendatasoft has developed its API Automation feature, enabling customers to fully automate key areas of portal management.

To learn about the API’s technical capabilities and the benefits it provides to data portal administrators and the wider organization, we interviewed Coralie Lohéac, Lead Product Manager, and Hugo Bost, Software Engineer, part of the team overseeing the project’s development.


Coralie:

API Automation automates the management of the entire back office of data portals created with Opendatasoft. It helps ensure that the portal is completely interoperable with all the other tools that make up an organization’s Information Systems (IS).

Data portals are becoming central to how organizations share and use their data. We therefore developed the API to meet our customers’ growing needs to deploy large-scale data portals that seamlessly integrate with their other data management tools.

Across their tech stacks, organizations deploy a range of highly specialized tools to manage areas such as data quality, storage, and governance before data is shared via their data portal. To industrialize data sharing at scale, they need to be able to guarantee quality across the end-to-end data process. That makes it essential to use APIs to manage data flows automatically and to guarantee the integrity of data updates.

With API Automation, our customers can now safeguard quality, minimize the need for human involvement and therefore scale up their data projects.

Hugo:

In other words, Opendatasoft’s API Automation feature is a technical tool that enables customers to ensure the Opendatasoft platform communicates seamlessly with the other applications within their tech stack, without having to use the portal’s management interface. The result is fluid, rapid and automated integration.

All you need to do is configure the process you want to automate, for example by writing a script. Once you've created and deployed it, it runs automatically, keeping data flows up to date and guaranteeing the integrity of the data on your Opendatasoft portal.
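
To make that concrete, here is a minimal sketch of such a script in Python. The base URL pattern, the `Apikey` authorization scheme, the `results` response key, and the `YOUR_DOMAIN`/`YOUR_API_KEY` placeholders are illustrative assumptions rather than verbatim product documentation; check your portal's API reference for the exact routes and response shapes.

```python
import requests

# Illustrative base URL and API-key scheme -- confirm against your portal's docs.
BASE_URL = "https://YOUR_DOMAIN.opendatasoft.com/api/automation/v1.0"
HEADERS = {"Authorization": "Apikey YOUR_API_KEY"}

def list_datasets() -> dict:
    """Read the catalog managed by the portal's back office."""
    response = requests.get(f"{BASE_URL}/datasets/", headers=HEADERS)
    response.raise_for_status()  # fail loudly so a scheduler can alert on errors
    return response.json()

if __name__ == "__main__":
    catalog = list_datasets()
    # The "results" key is an assumption about the response shape.
    print(f"Portal exposes {len(catalog.get('results', []))} datasets")
```

Deployed on a scheduler such as cron or a CI pipeline, a script like this runs entirely outside the portal's management interface.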


Coralie:

API Automation automates over 150 actions on the platform, from data publication, through domain management, to user permissions. Organizations can therefore pick the actions that meet their specific needs, automating the management of all, or just part, of their data flows. For example, many of our customers use it to automate the management of their metadata, ensuring it is always reliable and accurate. Through the API you can update metadata en masse, with any modifications immediately reflected in your other tools.
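
A bulk metadata update might look like the sketch below. The read-modify-write pattern is generic REST practice; the payload layout, the `publisher` field, and the dataset UIDs are hypothetical and will depend on your portal's metadata templates.

```python
import requests

BASE_URL = "https://YOUR_DOMAIN.opendatasoft.com/api/automation/v1.0"
HEADERS = {"Authorization": "Apikey YOUR_API_KEY"}

def set_publisher(dataset_uid: str, publisher: str) -> None:
    """Read a dataset's metadata, patch one field, and write it back."""
    url = f"{BASE_URL}/datasets/{dataset_uid}/"
    current = requests.get(url, headers=HEADERS)
    current.raise_for_status()
    body = current.json()
    # Assumed metadata layout: template name, then field name, then a value wrapper.
    body["metadata"]["default"]["publisher"] = {"value": publisher}
    requests.put(url, headers=HEADERS, json=body).raise_for_status()

# Apply the same correction across the whole catalog in one pass.
for uid in ["da_abc123", "da_def456"]:  # hypothetical dataset UIDs
    set_publisher(uid, "Data Office")
```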

Other customers use it to automate the entire process of publishing data on their portal. If a change occurs in another IS tool, such as new data becoming available, user permission modifications, or updates to the confidentiality level of a visualization, then this is automatically reflected in the platform, guaranteeing high-quality, seamless integration with datasets and dashboards.
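
As a sketch of that publication flow, the snippet below assumes a hypothetical `publish` action endpoint and a placeholder `source_has_new_data()` check standing in for whatever change signal the upstream tool emits (a webhook, a message queue, or polling); both names are ours, not the product's.

```python
import requests

BASE_URL = "https://YOUR_DOMAIN.opendatasoft.com/api/automation/v1.0"
HEADERS = {"Authorization": "Apikey YOUR_API_KEY"}

def source_has_new_data() -> bool:
    """Placeholder: replace with your upstream tool's change signal."""
    return True

def republish(dataset_uid: str) -> None:
    # Hypothetical action route: trigger a fresh publication of the dataset.
    url = f"{BASE_URL}/datasets/{dataset_uid}/publish/"
    requests.post(url, headers=HEADERS).raise_for_status()

if source_has_new_data():
    republish("da_abc123")  # the upstream change reaches the portal with no manual step
```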


Coralie:

Since we made the feature available, we’ve seen over 8.5 million API calls made using it in just one month. This includes 1.7 million calls to create objects including datasets, pages and metadata in Opendatasoft. This demonstrates incredibly rapid adoption, and shows that it is already delivering real value to our customers.

Customers are seeing benefits in two main areas. Firstly, thanks to the automation of data management processes, they are saving time by removing the need to manually update data or set up the portal. For larger projects, scripting can be completed very quickly, eliminating some or all of the manual tasks involved in portal management.

Secondly, API Automation enables data projects to be deployed on a larger scale. This makes it possible to manage larger quantities of data, and removes the technical obstacles to publishing and sharing data on an industrial scale, resulting in a significant return on investment.

Hugo:

API Automation minimizes the risk of human error. This is essential not only to guarantee the security of the organization's data, but also to provide real reassurance and confidence in data reliability and quality.

APIs also enable data portals to deliver better services, as data is updated without human intervention. For some companies, this is essential, such as in the energy sector, where consumption data is constantly changing and needs to be updated as regularly as possible.
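
As a minimal sketch of that kind of unattended refresh, reusing the hypothetical `publish` action from above, a simple polling loop could look like this (a production setup would more likely rely on cron, an orchestrator, or webhooks):

```python
import time

import requests

BASE_URL = "https://YOUR_DOMAIN.opendatasoft.com/api/automation/v1.0"
HEADERS = {"Authorization": "Apikey YOUR_API_KEY"}

# Re-publish a fast-moving consumption dataset every hour, with no human in the loop.
while True:
    requests.post(f"{BASE_URL}/datasets/da_energy01/publish/",  # hypothetical UID and route
                  headers=HEADERS).raise_for_status()
    time.sleep(3600)
```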


Coralie:

The API Automation feature was developed so that it could be used autonomously by our customers, ensuring that it is fully interoperable with the other data management tools they use. This was essential to widening its use, and therefore ensuring it has a real, positive impact on the jobs of all data portal managers.

At an organizational level, it’s also a solution for sharing resources and information, while maintaining the highest levels of security, control and confidentiality.

We’ve built it on the REST architectural style. As well as being technically strong, REST’s popularity delivers a number of advantages: for example, it’s very easy to find tools and tutorials for working with REST APIs online.

Hugo:

API Automation is an HTTP RESTful API. RESTful APIs are not a protocol or a standard; rather, they follow a set of architectural constraints. The API Automation feature is based on the HTTP standard and complies with a set of best practices that make it easy to use and evolve.
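
In practice, those conventions mean standard HTTP verbs map onto portal resources in predictable ways. The routes and payload below illustrate the general REST pattern rather than the product's exact endpoints:

```python
import requests

BASE_URL = "https://YOUR_DOMAIN.opendatasoft.com/api/automation/v1.0"
HEADERS = {"Authorization": "Apikey YOUR_API_KEY"}
payload = {"metadata": {"default": {"title": {"value": "Energy consumption"}}}}  # assumed shape

# One verb per intent, as REST conventions prescribe:
requests.get(f"{BASE_URL}/datasets/", headers=HEADERS)                          # read the collection
requests.post(f"{BASE_URL}/datasets/", headers=HEADERS, json=payload)           # create a resource
requests.put(f"{BASE_URL}/datasets/da_abc123/", headers=HEADERS, json=payload)  # replace it (idempotent)
requests.delete(f"{BASE_URL}/datasets/da_abc123/", headers=HEADERS)             # remove it
```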

Developers are familiar with this type of API and can work with it in a variety of ways, through their preferred tools and working methods, which helps explain its popularity.

The choice of a RESTful structure was a natural one, since we wanted to offer an API that was easy for our customers to use. It is scalable and enables us to meet all our customers’ needs when it comes to automating data portal management.

Want to find out more about API Automation? Book a personalized demo with one of our experts today!
