
Still don't know what a data service is? Bookmark this data service tool!

Let's start with the industry background. In recent years, enterprises have thrown themselves into digital construction and digital transformation, and the effort has been in full swing. Plenty of solutions and data architectures have been produced, but they all hide many pain points. One of the more prominent problems is agile data services.

What are the problems with agile data services? First of all, the database architecture is too complex, and the database vendors are even more varied; the result is a veritable "hodgepodge" of data. Faced with such a complex situation, enterprises have to pay high salaries to hire engineers for each different development language.

Development itself is fine, but it raises a series of tricky problems once it's done. Take calling API interfaces as an example: API stability is arguably the hardest thing to guarantee. On top of that, acquiring the data is also troublesome. The development team has to build a data interface for each need, and since business needs differ, the interfaces differ too; you can imagine the workload.

So how do we solve these pain points? This is where the concept of data services comes in. What is a data service? Whom does the data serve? How is the data served? Today's article will discuss the concept of data services with you in depth. I hope it helps!

1. The concept of data services

Data service is actually a cross-domain concept, related to several fields such as data integration, full-lifecycle API management, DaaS, and EiPaaS (to be covered in a follow-up article). Truth be told, one of the core points of data services is to make data available as a commodity to different systems and users, and that commodity is available on demand.

To put it bluntly, data service = data as a service. This is premium, VIP-grade service: it applies regardless of the data's source and regardless of the type of service involved, such as aggregation, data quality management, or data cleansing. For providers, no matter how geographically or organizationally separated they are from consumers, as long as users have needs, they supply data according to those specific needs.

So why can this cool, lofty concept of data services actually be implemented? There are two main enablers: service-oriented architecture (SOA) and widely adopted APIs.
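To make the "data as a commodity, on demand" idea concrete, here is a toy sketch (all names and data are invented, not from any specific product): consumers request a logical dataset and state what they need, without knowing which backing store holds it.

```python
# A minimal "data as a service" sketch: consumers ask for data by name
# and filter, without touching the underlying stores directly.

# Two hypothetical backing stores with different shapes.
ORDERS_DB = [
    {"order_id": 1, "region": "East China", "amount": 120.0},
    {"order_id": 2, "region": "North China", "amount": 80.5},
]
USERS_CSV = [
    {"user_id": "u1", "name": "Alice"},
    {"user_id": "u2", "name": "Bob"},
]

# The registry maps a logical dataset name to a fetch function,
# hiding where and how the data is actually stored.
REGISTRY = {
    "orders": lambda: ORDERS_DB,
    "users": lambda: USERS_CSV,
}

def data_service(dataset, **filters):
    """Serve the rows of `dataset` that match all given field filters."""
    rows = REGISTRY[dataset]()
    return [r for r in rows if all(r.get(k) == v for k, v in filters.items())]

east = data_service("orders", region="East China")
```

The consumer's contract is the function signature; swapping the store behind the registry does not change a single caller.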


2. Sub-scenarios of data services

1. Data Release

In the past, data departments produced massive amounts of data, but opening it up easily and efficiently was a hard problem. In the era before data services, the way to open up data was simple and crude: generally, export it and hand it directly to the other party. This approach was not only inefficient but also brought many problems, such as security risks. Now, with data services, internal business systems can open up data directly. For standard data developed on the platform, users can encapsulate it as API assets through simple configuration, for unified management and publishing. More importantly, users can close the loop of data development scenarios with a single data development tool, covering the entire link from data collection to data processing to data sharing within the enterprise. This is like building a highway for data circulation, letting data flow smoothly within the enterprise, which is very convenient indeed.
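"Encapsulating standard data as an API asset through simple configuration" can be pictured roughly as follows (a sketch only; the configuration keys here are invented, not FineDataLink's actual format): a query definition is turned into a normalized endpoint descriptor that a gateway could manage and publish uniformly.

```python
# Sketch: turn a dataset + query definition into a publishable API asset.
def publish_api(name, source_table, params, version="v1"):
    """Wrap a query over `source_table` into a normalized API descriptor
    that a gateway could register, manage, and expose uniformly."""
    return {
        "endpoint": f"/api/{version}/{name}",
        "method": "POST",
        "source": source_table,
        "params": list(params),  # caller-supplied filter parameters
        "status": "published",
    }

asset = publish_api("order_query", "S Order Data", params=["area"])
```

The point is that publishing is configuration, not coding: the same descriptor shape serves every dataset, so each new data need no longer means a new hand-written interface.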


2. Data Write

Gartner makes the important point that all integration challenges can be broken down into some combination of three integration patterns:

  • Mode 1: Data consistency integration
  • Mode 2: Multi-step process integration
  • Mode 3: Create a composite service
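To make Mode 3 a bit more concrete: a composite service fans out to several underlying services and merges their answers into one response. A toy sketch, with invented service names:

```python
# Toy sketch of Mode 3: one composite call for the consumer,
# several service calls underneath.
def customer_service(customer_id):
    """Hypothetical underlying service: basic customer record."""
    return {"customer_id": customer_id, "name": "Acme Ltd"}

def order_service(customer_id):
    """Hypothetical underlying service: the customer's orders."""
    return [{"order_id": 101, "amount": 250.0}]

def customer_profile(customer_id):
    """Composite service: merge both answers into a single response."""
    profile = customer_service(customer_id)
    profile["orders"] = order_service(customer_id)
    return profile

profile = customer_profile("c42")
```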

These three patterns are relatively simple, so I won't over-explain them. Why does data integration matter here? Because a data service accesses data in real time through API interfaces, it further enriches the means of data collection and solves most of the problems above.

In addition, when data services are combined with streaming ETL, the parsing logic and the write destination can be flexibly customized. What does this mean? In short, the enterprise has the final say: it can set data parsing rules and write targets according to its own business needs.

However, this combination is not perfect; it has certain limitations, reflected in its reliance on the streaming ETL (real-time computing) module. It's like a high-performance car that can certainly drive, but only on a specific fuel. If the streaming ETL module fails or its performance falls short, the whole data service and real-time data access may be affected, which in turn hurts the enterprise's data processing and business operations.
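The "flexibly customized parsing logic and write target" idea can be sketched like this (a simplified in-memory pipeline; real streaming ETL modules also handle clusters, checkpoints, and back pressure):

```python
import json

# Sketch: a pipeline where the enterprise supplies both the parsing
# rule and the write target as pluggable functions.
def parse_rule(raw):
    """Enterprise-defined parsing: decode JSON and keep selected fields."""
    record = json.loads(raw)
    return {"order_id": record["order_id"], "amount": record["amount"]}

warehouse = []  # stand-in for whatever write destination is chosen

def write_target(record):
    """Enterprise-defined write: here, append to the in-memory sink."""
    warehouse.append(record)

def run_pipeline(events, parse, write):
    for raw in events:
        write(parse(raw))

events = ['{"order_id": 1, "amount": 9.9, "noise": "x"}']
run_pipeline(events, parse_rule, write_target)
```

Swapping `parse_rule` or `write_target` changes the behavior without touching the pipeline itself, which is exactly the flexibility (and the dependency) described above.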


3. Data Orchestration

Through visual development, users can flexibly orchestrate small amounts of data in real time, both on and off the cloud, and build process applications. From the perspective of product integration, alongside the data development service itself, it can also be paired with no-code application building platforms (such as Jiandao Cloud) to receive data in real time for custom processing.


4. Data subscription and push

One thing should be clear: the data consumption needs of users and data consumers are diverse. For data consumers, the advantage of data services lies in one word: convenience. A simple data service configuration meets their needs. For data providers, the timing of data release is entirely at their discretion, a mechanism that amounts to effective control over the data. At the same time, if users have real-time requirements, a simple data service configuration can also serve business systems such as large-screen dashboards.

However, it is worth noting that for real-time data push, the flexible push method still depends on the streaming ETL (real-time computing) module. In other words, if the streaming ETL module is poorly built, a series of problems may follow, including delays, data loss, or inaccurate pushes, which naturally affects data consumers' use of real-time data and the normal operation of business systems.
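The subscribe-and-push mechanism boils down to a publish/subscribe pattern: consumers register interest once, and the provider decides when to release. A minimal in-process sketch (not any product's actual API):

```python
# Minimal publish/subscribe sketch: the provider controls when data
# is released; subscribers just receive what they signed up for.
subscribers = {}  # topic -> list of callback functions

def subscribe(topic, callback):
    """A consumer registers once for a topic."""
    subscribers.setdefault(topic, []).append(callback)

def push(topic, data):
    """Provider-initiated release: push `data` to every subscriber."""
    for callback in subscribers.get(topic, []):
        callback(data)

received = []
subscribe("sales_dashboard", received.append)  # e.g. a big-screen display
push("sales_dashboard", {"kpi": "daily_sales", "value": 10.0})
```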


In short, these four sub-scenarios of data services are interrelated and complementary, and together they constitute a complete data service system.

3. Recommendation of data service tools

There are actually many data service tools on the market, such as FineDataLink, Tableau, and Informatica PowerCenter. Today's article won't take inventory of them all; instead it focuses on the tool I usually use, FineDataLink. In fact, FineDataLink's application scenarios come down to two:

  • According to enterprise security regulations, direct connections to business databases are not allowed; developing interfaces in code is inefficient, and manual transmission is error-prone.
  • Without a secure data sharing mechanism, IT tends to reinvent the wheel as the number of data consumers grows.

FDL's data service provides data sharing capabilities: it encapsulates processed and fused data and publishes it as normalized API interfaces for external systems to call.



Let's walk through how to use FineDataLink to call a published API:

1. Preparation

To prepare, the caller must first be granted permission to use the API.

2. Calling the generated API from FDL

1) Click "Data Development" to create a scheduled task.

2) The API created in section 2.3 extracts data from the "S Order Data" table in the demotest database, using the consignor's region as the parameter area, so we first need to assign a value to area.

Click "Parameter List" to generate the parameter area and set its value to "East China". As shown in the figure below:


3) Drag in the "Data Synchronization" node, enter the API information, and call the interface. As shown in the figure below:

The body content can be copied directly from section 2.3.3 of this article, changing the value of area to ${area}.
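Outside the FDL interface, the same call could be scripted. The sketch below substitutes the area parameter into a request-body template and shows how the filled body would be posted (the endpoint and body shape are placeholders for illustration, not the real published API):

```python
import json
import urllib.request

# Hypothetical request-body template, with ${area} as the parameter slot.
BODY_TEMPLATE = '{"params": {"area": "${area}"}}'

def build_body(template, **params):
    """Fill ${name} placeholders in the body template."""
    for name, value in params.items():
        template = template.replace("${" + name + "}", str(value))
    return template

def call_api(url, body):
    """Post the filled body to the published API (not invoked here)."""
    req = urllib.request.Request(
        url,
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

body = build_body(BODY_TEMPLATE, area="East China")
```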


Click "Data Preview", as shown in the figure below:


If you enter data in "Response Body Processing", the data fields can be parsed out. As shown in the figure below:

Alternatively, drag in the "Data Conversion" node, use the "API Input" operator to fetch the API data, and then use the "JSON Analysis" operator to parse out the data fields.
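What such JSON parsing does can be approximated in a few lines: take the raw response body and pull out the record list inside it (the response shape below is invented for illustration; real field names will differ):

```python
import json

# Hypothetical API response body, for illustration only.
RESPONSE = '{"code": 0, "data": [{"order_id": 1, "area": "East China"}]}'

def parse_response(body, data_field="data"):
    """Parse the response body and extract the record list."""
    payload = json.loads(body)
    return payload.get(data_field, [])

rows = parse_response(RESPONSE)
```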


Through this analysis of the concept of data services and the demonstration of a data service tool, I believe you now have a deeper understanding of data services.
