REST stands for Representational State Transfer. Learn more about REST APIs, how they simplify communication with servers, and how they support working with data at scale.
Representational State Transfer, or REST, is an architectural style that guides the design and development of the processes that allow us to interact with (create, read, update, and delete) data stored on web servers. In other words, REST simplifies communication with data on a server by mapping those interactions onto Hypertext Transfer Protocol (HTTP) methods.
With REST, clients communicate with servers using the HTTP protocol. Because HTTP is designed for use at the largest possible scale, the Internet, the coupling between the client and the origin server has to be as loose as possible to allow for that scale.
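In concrete terms, each CRUD interaction maps onto an HTTP method. Here is a minimal sketch of that mapping (the detailed fetch examples for each method appear later in this article):

// CRUD operations and the HTTP methods that conventionally implement them.
const crudToHttp = {
  create: 'POST',
  read: 'GET',
  update: 'PUT', // or 'PATCH' for partial updates
  delete: 'DELETE',
};
console.log(crudToHttp.read); // "GET"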
There are six constraints that guide REST. When applied to a system’s architecture, they impart performance, scalability, simplicity, modifiability, visibility, portability, and reliability:
Client-server: The client is separated from the server by a well-defined interface.
Stateless: Each request carries all the information needed to process it, so the client does not consume server storage when it is not processing a request.
Cacheable: Responses declare themselves as cacheable or non-cacheable in order to reduce latency (see the sketch after this list).
Uniform interface: Interfaces are consistent in design across the system.
Layered system: The client cannot tell whether it is connected directly to the end server or to an intermediary on the path to the end server.
Code on demand (optional): Servers can temporarily extend or customize the functionality of a client as needed by transferring logic to the client, such as client-side scripts, that can be executed within a standard virtual machine (VM).
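To make the statelessness and cacheability constraints concrete, here is a minimal sketch against the fictional SamplePlaceholder API used in the examples later in this article; the Authorization token is a hypothetical placeholder, since the endpoint’s authentication scheme is not defined here.

// Statelessness: every request carries all the context the server needs,
// so the server keeps no session state between requests.
fetch('https://sampleplaceholder.example.com/todos/1', {
  headers: {
    Authorization: 'Bearer <your-access-token>', // hypothetical placeholder token
  },
})
  .then((response) => {
    // Cacheability: the response declares how long it may be cached,
    // e.g. "max-age=60", via standard HTTP headers.
    console.log(response.headers.get('Cache-Control'));
    return response.json();
  })
  .then((json) => console.log(json));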
An Application Programming Interface, or API, establishes a connection between programs, so that they can transfer data. If a program has an API, some of its data is used by the front-end of the program or another application entirely.
In order for this data to be accessed and used, a request must be sent to the API. If the request meets the correct parameters, the API responds with the requested data, usually in JSON or XML format. An API includes documentation that outlines what data is available from it and how to structure your request for that data.
In other words, the API is a courier of data that requires the right instructions to deliver. Or, to use a real-world example, the API is a server in a restaurant. When you order from the menu, the server puts your request into the kitchen and carries your food to your table.
The following are JavaScript examples of REST API requests to the fictional “SamplePlaceholder” API.
fetch('https://sampleplaceholder.example.com/todos/1')
  .then((response) => response.json())
  .then((json) => console.log(json));
In the above GET example, we request the to-do item with ID 1 and parse the response body as JSON.
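Note that fetch resolves even for error responses such as 404, so a common refinement (a sketch against the same fictional endpoint, not part of the original example) is to check response.ok before parsing:

fetch('https://sampleplaceholder.example.com/todos/1')
  .then((response) => {
    if (!response.ok) {
      // Covers cases such as a 404 for a to-do that does not exist.
      throw new Error(`Request failed with status ${response.status}`);
    }
    return response.json();
  })
  .then((json) => console.log(json))
  .catch((error) => console.error(error));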
fetch("https://sampleplaceholder.example.com/todos", {
method: "POST",
body: JSON.stringify({
userId: 1,
title: "Fix bugs",
completed: false
}),
headers: {
"Content-type": "application/json; charset=UTF-8"
}
});
In the above POST example, the body contains the data to be sent to the server and added to the SamplePlaceholder to-dos collection. The headers specify the type of content being sent, which in this case is JSON. After sending this request, you could issue a GET request with fetch to verify that the new data appears as expected.
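For example, assuming the API follows the common convention of echoing back the created resource with a new id field (an assumption, since the fictional API’s responses are not documented here), that verification step might look like this sketch:

fetch("https://sampleplaceholder.example.com/todos", {
  method: "POST",
  body: JSON.stringify({ userId: 1, title: "Fix bugs", completed: false }),
  headers: { "Content-type": "application/json; charset=UTF-8" }
})
  .then((response) => response.json())
  // Assumes the server returns the created to-do, including its generated id.
  .then((created) => fetch(`https://sampleplaceholder.example.com/todos/${created.id}`))
  .then((response) => response.json())
  .then((json) => console.log(json));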
fetch('https://sampleplaceholder.example.com/posts/2', {
  method: 'PATCH',
  body: JSON.stringify({
    title: 'new todo',
  }),
  headers: {
    'Content-type': 'application/json; charset=UTF-8',
  },
})
  .then((response) => response.json())
  .then((json) => console.log(json));
In the above PATCH example, the request updates the post with ID 2, changing only its title. The second argument to fetch is an options object that defines the request method, the body (the data to be sent), and the headers, which specify the type of data being sent, in this case JSON.
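By contrast, a PUT request conventionally replaces the entire resource, so its body carries every field rather than only the one being changed. Here is a sketch against the same fictional endpoint, assuming the post has userId, title, and completed fields (the resource’s actual shape is not documented here):

fetch('https://sampleplaceholder.example.com/posts/2', {
  method: 'PUT',
  // PUT replaces the whole resource, so all fields are sent, not just the title.
  body: JSON.stringify({
    userId: 1,
    title: 'new todo',
    completed: false,
  }),
  headers: {
    'Content-type': 'application/json; charset=UTF-8',
  },
})
  .then((response) => response.json())
  .then((json) => console.log(json));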
fetch('https://sampleplaceholder.example.com/posts/5', {
  method: 'DELETE',
});
In the above DELETE example, the request permanently removes the post with ID 5 from the posts collection.
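Because many APIs return an empty body for a successful DELETE (typically status 200 or 204), checking the status code is usually more useful than parsing JSON. A minimal sketch:

fetch('https://sampleplaceholder.example.com/posts/5', {
  method: 'DELETE',
})
  .then((response) => {
    // 200 or 204 conventionally indicates the resource was deleted.
    console.log(`Delete completed with status ${response.status}`);
  });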
Confluent’s cloud-native solution, the 10x Apache Kafka® service, is elastic, resilient, and performant. Powered by the Kora Engine, it manages 30,000+ fully managed clusters with a 99% uptime SLA. With 70+ fully managed connectors, it enables you to connect any app and system so you can rapidly build, test, and deploy streaming data pipelines.
Confluent Cloud for Apache Flink® provides a REST API for managing your Flink SQL statements and compute pools programmatically, and Confluent Cloud provides REST APIs for cluster management and for producing records to topics, giving you greater flexibility in how you manage your cluster.
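As an illustration, producing a record to a topic over the Confluent Cloud REST API might look roughly like the sketch below. The REST endpoint, cluster ID, topic name, and API key and secret are placeholders, and the exact path and payload shape should be confirmed against the Confluent Cloud REST API reference.

// Placeholders: substitute your own REST endpoint, cluster ID, topic, and API key/secret.
const endpoint = 'https://<rest-endpoint>/kafka/v3/clusters/<cluster-id>/topics/orders/records';
const credentials = btoa('<api-key>:<api-secret>');

fetch(endpoint, {
  method: 'POST',
  headers: {
    Authorization: `Basic ${credentials}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    // The record value; the type/data envelope follows the v3 produce payload format.
    value: { type: 'JSON', data: { orderId: 123, status: 'shipped' } },
  }),
})
  .then((response) => response.json())
  .then((json) => console.log(json));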
If you’re ready to put Confluent Cloud to work and take advantage of our REST APIs for better data streaming management, get started for free.