
      Using cURL with RESTful APIs


      In web programming, developers often have to interact with online databases. Many of these services provide a
      Representational State Transfer (REST) API that allows authorized users to read and write data. Fortunately, the
cURL application allows users to easily access REST APIs from the command line. This guide discusses how to use cURL to query RESTful APIs. It also explains how the curl command-line utility applies the RESTful verbs, how to inspect headers, and how to add authorization to requests.

      An Introduction to Using cURL with RESTful APIs

      What is cURL?

      cURL stands for “Client URL” and is a data transfer application. It consists of two components, the libcurl client-side library and the curl command-line tool. cURL was originally designed to allow Linux IRC users to automate common tasks. However, it is now available for most operating systems and behaves similarly across platforms.

      Note

      cURL is the complete data transfer application, including the library, while curl is the command-line utility. The two terms are often used interchangeably. This guide mainly discusses the curl utility, which transmits commands directly to a remote REST API.

      curl uses the libcurl library and a simple URL-based syntax to transmit and receive data. It can be used as a stand-alone command line application, or inside scripts or web applications. The curl utility is common in embedded applications for vehicles, routers, printers, and audio-visual equipment. It is also used to access REST APIs and to test new APIs.

The cURL application is:

• free and open source.
• portable across operating systems.
• thread safe.

It also supports:

• most transfer protocols and web technologies, including HTTP, FTP, SFTP, and SCP.
• IPv6 and dual-stack requests.
• APIs or bindings for over 50 programming languages, including C/C++, Java, and Python.

      What is REST?

      REST is an architecture consisting of best practices and patterns for web development. It is a set of guidelines for developers rather than a true protocol. Websites and applications are considered RESTful if they follow REST principles. REST is now the industry-standard model for client-server interactions on the web, and most popular web services are only accessible through REST interfaces. The most important REST guidelines are as follows:

      • Client-server Architecture: Clients and servers are loosely coupled and communicate via an API.
      • Statelessness: Requests are independent and do not rely on the current state of the transaction.
• Caches: Responses can be cached, improving performance and reducing server load.
• Layering: Additional features, such as security protocols, can be added to REST as a separate layer. For example, the user can be authenticated and then the request can be passed to another layer for processing.
• Uniform interfaces: Clients use well-known URIs to request information. They must identify the specific resource to access and the format to use. The interface is generic and not customizable, so all clients access services in the same way.

      REST principles are straightforward. Clients use a Uniform Resource Identifier (URI) to request information from a server. Inside the message, which is typically sent using HTTP, the client identifies the resources it wants. It can also specify a format for the reply. The server replies with the requested data, in JavaScript Object Notation (JSON), HTML, or XML. A REST request includes the following components:

      • An HTTP method indicating the requested operation, such as GET or PUT.
      • A header, including the media type the sender wants to receive. Some examples are text/css and image/gif.
      • The URI to the resource, including any optional parameters. A client can specify the URI using the formats example.com/products/137 or example.com/products/:id.
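
For example, a minimal raw HTTP request combining these components might look like the following (the path and header values are illustrative):

GET /products/137 HTTP/1.1
Host: example.com
Accept: application/json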

      The REST architecture is an industry standard because it offers many advantages. Some of its advantages are as follows:

      • It is scalable, fast, robust, and efficient. REST APIs do not use much bandwidth.
      • It is easy to understand and implement.
      • It promotes modular architecture and good design.
• Clients and servers are fully decoupled, so it is easier to make changes to the API or the internal design, and the separation improves security.
      • It allows many different message formats.

      However, REST cannot process any requests based on the state of the transaction. It also does not guarantee reliability or include any security features. Client applications must implement these features.

      What are RESTful Verbs?

REST interfaces allow for a fixed set of interactions. Taken together, these operations are known as the RESTful verbs or REST verbs. Each RESTful verb indicates the action the client wants the server to perform on a resource.

Each distinct operation is associated with a specific RESTful verb and a range of possible status codes. A client like curl must include a RESTful verb in each HTTP request. The RESTful verbs correspond to the main create, read, update, and delete (CRUD) database operations.

      Here are the main RESTful verbs that allow curl to use a REST API:

• POST: This RESTful verb creates a new resource on the server. If successful, the POST action returns code 201 for “Created” and provides a link to the new resource. Failure codes include 404 for “Not Found”, or 409 for “Conflict” if the item already exists.
      • GET: GET is used to retrieve information from the server. It can read an entire list or one specific item, and returns code 200 for “OK” if successful. If the item or collection cannot be found, the server returns code 404.
      • PUT: The PUT REST verb is used to update a specific item. The client must specify all attributes for the item. This method returns the status code 200 when the item is updated. The server returns either 404 for “Not Found” or 405 for “Method Not Allowed” if the update fails.
• PATCH: This REST verb is similar to PUT. It modifies the item, but sends only the changed fields rather than the entire item. However, this verb is not considered safe from collisions, so it is often avoided and is not used very much.
      • DELETE: The DELETE RESTful verb deletes an entry from the database, although it can also potentially delete the entire collection. It returns code 200 when successful, and code 404 or 405 otherwise.
      • OPTIONS: This verb fetches a list of all available operations.

      For almost all APIs, the POST, PUT, PATCH, and DELETE operations require server authentication. However, many servers allow anonymous GET operations for public data. If the server cannot authorize a user, it returns the failure code 401 for “Unauthorized”. Failure code 403, or “Forbidden”, is used if the client is not allowed to access the resource.
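
One quick way to see these status codes in practice is curl's -w option, which prints details about a completed request. The following sketch, using the hypothetical API from the examples later in this guide, discards the response body and prints only the status code:

curl -s -o /dev/null -w "%{http_code}\n" https://example.com/api/2/employees/10

A response of 200 indicates success, while 401 or 403 signals an authentication or authorization problem.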

      Installing curl

As of 2022, the most recent release of curl is version 7.83.0. curl usually comes pre-installed on Ubuntu and other Linux distributions. To see if curl is already installed, run the curl command with the -V flag for “version”. The local installation might not match the latest release, but any recent version should be adequate.

      curl -V
      
      curl 7.68.0 (x86_64-pc-linux-gnu) libcurl/7.68.0 OpenSSL/1.1.1f zlib/1.2.11 brotli/1.0.9 libidn2/2.3.0 libpsl/0.21.0 (+libidn2/2.2.0) libssh/0.9.3/openssl/zlib nghttp2/1.40.0 librtmp/2.3
      Release-Date: 2020-01-08
      Protocols: dict file ftp ftps gopher http https imap imaps ldap ldaps pop3 pop3s rtmp rtsp scp sftp smb smbs smtp smtps telnet tftp
      Features: AsynchDNS brotli GSS-API HTTP2 HTTPS-proxy IDN IPv6 Kerberos Largefile libz NTLM NTLM_WB PSL SPNEGO SSL TLS-SRP UnixSockets

If necessary, curl can be installed using apt install. Ensure the package index is updated first.

sudo apt update
sudo apt install curl
      

Documentation for curl can be found on the curl website. The source code can be found on the curl GitHub page.

      Command Line Options for curl

      To use curl from the command line, type curl and the URL to access.

      curl example.com
      

      By default, curl displays its output in the terminal window. However, the -o option redirects the output to a file.

      curl -o source.html example.com
      

      curl includes a wide range of options. To see a list of all options, use the --help option.

      curl --help
      

      Some of the most important options/flags are as follows:

• -B: Use ASCII for text transfers.
• -C: Resume an interrupted transfer.
• -d: Send data with an HTTP POST or PUT command.
• -E: Use a client certificate file and optional password.
• -F: Submit HTTP form data, for example from a file.
• -H: Pass a custom header to the server.
• -K: Use a file for the configuration.
• -m: Set a maximum time for the transfer.
• -N: Disable buffering.
• -o: Write the output to a file.
• -s: Run in silent mode.
• -u: Add a username and password for the server.
• -v: Verbose mode, for more details.
• -X: Specify the HTTP method to use.
• -4: Use IPv4 addresses.
• -6: Use IPv6 addresses.
• -#: Display a progress bar. This is useful for large transfers.
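
These options are frequently combined. As an illustrative sketch, the following command resumes an interrupted download of a hypothetical archive, writes it to a file, limits the transfer to five minutes, and displays a progress bar:

curl -C - -o archive.tar.gz -m 300 -# https://example.com/archive.tar.gz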

      cURL vs wget

The wget utility is a simpler alternative to curl. wget is a command-line-only utility, while the full cURL application also includes the libcurl library, which makes curl capable of more complicated tasks.

      Some of the similarities and differences between curl and wget are as follows:

      • Both utilities can be used from the command line.
      • They can both use FTP and HTTP and support proxies and cookies.
      • Both curl and wget are free and open source utilities.
      • Both run on a large number of operating systems and are completely portable.
      • Both can transmit HTTP POST and GET requests.
      • wget can be used recursively while curl cannot.
      • wget can automatically recover from a broken transfer. curl must be restarted.
      • curl includes the powerful libcurl API.
      • curl supports more protocols, SSL libraries, and HTTP authentication methods.
      • curl is bidirectional and can do transfers in parallel.
• curl supports many more security measures, more HTTP versions, and dual-stack IPv4/IPv6 transfers.

Either utility is fine for most simple HTTP requests and downloads. If you are familiar with only one of the tools and it is suitable for your requirements, continue to use it. However, wget is only a simple transfer utility, while curl is a better all-purpose tool for heavy-duty and professional use. See our guide How to Use wget to learn more about this pared-down alternative to curl.

      cURL Methods

curl uses several HTTP commands to connect to remote REST APIs. These actions correspond to the different REST verbs. The syntax for RESTful requests is simple and straightforward and is similar to other curl requests. For thorough documentation on how to use curl, see the official curl documentation.

To determine the URIs to use for each operation, consult the API documentation provided for the tool or service. As an example, the official GitHub REST API documentation explains how to use the interface. When designing a REST interface, it is easy to test the API using curl.

      Note

      The following examples use example.com in the instructions. Substitute example.com with your own URI.

      GET

The GET operation allows curl to receive information from a REST API. To use the GET RESTful verb, use the curl command followed by the name of the resource to access. The -X option and the name of the operation are not required because GET is the default HTTP operation.

      The output varies based on the server. It includes a status, which is set to success if the request is valid, the data, and an optional message. In this case, the client does not specify a format for the data, so the server responds using JSON. To see more information about the transfer, including the server options, append the -v (verbose) option to the command.

      curl https://example.com/api/2/employees
      
      {"status":"success","data":[{"id":1,"name":"Tom","age":60,"image":""},
      ...
      {"id":40,"name":"Linda","age":50,"image":""}],"message":"All records retrieved."}

      To see one particular entry, append the id of the entry to retrieve. In this example, only the information for employee 10 is returned from the server. The output is again in JSON format.

      curl https://example.com/api/2/employees/10
      
      {"status":"success","data":{"id":10,"name":"Julia","age":33,"image":""},"message":"Record retrieved."}

      POST

      The POST verb allows users to push data to a REST API and add new entries to the remote database. The data is specified as an argument for the -d option. The data should be in a format matching the request. In this case, the -H option informs the server the data is in application/json format. If a format is not specified, curl adds Content-Type: application/x-www-form-urlencoded to the HTTP header. This might cause problems on some servers.

      The server returns the new record, including the id of the new entry. The following command adds a new record to the application server.

      Note

The curl command infers this is a POST operation based on the other details, but it is considered good practice to explicitly state the verb with the -X option.

      curl -d '{"name":"Jamie","age":"23","image":""}' -H 'Content-Type: application/json' -X POST https://example.com/api/2/create
      
      {"status":"success","data":{"name":"Jamie","age":"23","image":null,"id":5126},"message":"Record added."}

This approach is fine for small amounts of data. To add multiple records, pass a file containing the information to the server. The filename is indicated with an @ symbol followed by the file name, as follows (a sample file appears after the command):

      curl -d @data.json -H 'Content-Type: application/json' -X POST https://example.com/api/2/create
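
The file must contain data in a format the server accepts. Whether the endpoint takes a single record or an array of records depends on the API; a minimal hypothetical data.json matching the earlier example might look like this:

[
  {"name":"Jamie","age":"23","image":""},
  {"name":"Priya","age":"41","image":""}
]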
      

      PUT

The RESTful verb PUT modifies an existing entry. This option works similarly to the POST option. The -d flag specifies the updated information for the record, and -H indicates the data format. However, the id of the record to update must be included as part of the URI. For a PUT command, the verb must be stated explicitly using the -X option.

      curl -d '{"name":"Jamie","age":"23","image":""}' -H 'Content-Type: application/json' -X PUT  https://example.com/api/2/update/31
      
      {"status":"success","data":{"name":"Jamie","age":"23","image":null},"message":"Record updated."}

      DELETE

      The DELETE operation removes a record from the database. It is one of the simpler REST verbs to use. As part of the -X option, include the DELETE verb and append the id of the record to delete to the URI. The data and header flags are not required for this operation.

      curl -X DELETE https://example.com/api/2/delete/31
      
      {"status":"success","data":"31","message":"Record deleted"}

      Viewing and Changing Headers with cURL

      In normal usage, curl only displays the most relevant information, not the entire HTTP request and response. To view all information, including the HTTP headers, add the -v option to any curl command to activate verbose mode.

      curl -v example.com
      
      * TCP_NODELAY set
      * Connected to example.com (2606:2800:220:1:248:1893:25c8:1946) port 80 (#0)
      > GET / HTTP/1.1
      > Host: example.com
      > User-Agent: curl/7.68.0
      > Accept: */*
      >
      * Mark bundle as not supporting multiuse
      < HTTP/1.1 200 OK
      < Age: 409433
      < Cache-Control: max-age=604800
      < Content-Type: text/html; charset=UTF-8
      < Date: Tue, 03 May 2022 16:40:30 GMT
      < Etag: "3147526947+ident"
      < Expires: Tue, 10 May 2022 16:40:30 GMT
      < Last-Modified: Thu, 17 Oct 2019 07:18:26 GMT
      < Server: ECS (bsa/EB20)
      < Vary: Accept-Encoding
      < X-Cache: HIT

Any outgoing HTTP header in curl can be modified using the -H option. Some of the previous examples already demonstrated how to use this flag when setting the Content-Type. However, -H also allows users to modify any field in the header. The following example demonstrates how to turn off the User-Agent field in the header. When the header is reviewed in verbose mode, the field is no longer present.

      curl -H "User-Agent:" http://example.com -v
      
      *   Trying 2606:2800:220:1:248:1893:25c8:1946:80...
      * TCP_NODELAY set
      * Connected to example.com (2606:2800:220:1:248:1893:25c8:1946) port 80 (#0)
      > GET / HTTP/1.1
      > Host: example.com
      > Accept: */*
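
The -H option can also replace a field's value instead of clearing it. For example, to identify a custom client in the User-Agent field (the client name here is hypothetical):

curl -H "User-Agent: my-test-client/1.0" http://example.com -v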

      Authorization and Passwords with cURL

Many REST APIs require the user to authenticate using a valid username and password. The easiest way to provide this information is through the -u option of the curl command. Include the account name and password, separated by a colon (:). The following example executes the GET RESTful verb using authentication.

      curl -u user:password https://example.com/api/2/employee/10
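
Many APIs use token-based authentication instead of a username and password. In that case, the token is usually supplied in an Authorization header with the -H option. A sketch with a placeholder token:

curl -H 'Authorization: Bearer my_api_token' https://example.com/api/2/employees/10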
      

      Conclusion

      Although it is best known as a data transfer application, the cURL application can interact with REST APIs. It includes the curl command line utility and the fully-featured libcurl library. REST is a popular architecture for client-server applications. It decouples the two components and stresses modularity and efficiency. Information is exchanged through well-known URIs.

Users can access REST APIs using the RESTful verbs, which correspond to the basic HTTP actions. curl can send all common HTTP commands to a REST API, including GET, POST, PUT, and DELETE. The curl utility is straightforward to use. It has a few main options for data transmission, user authentication, and making header changes. For more information about curl, see the curl documentation.


      A Guide to API Formats: The Different Types of APIs


      APIs are what keep software connected. Whether you are looking to link your application to others or you want to have smooth communication between services, APIs help bring multiple pieces of an application together.

Applications and services can be connected in myriad ways, depending on access limitations and communication protocols. Several distinct API approaches have emerged to make these connections and support modern application architectures.

      In this tutorial, learn about what APIs are, the types of APIs that are available, and the various protocols they can use to communicate.

      What is an API?

      An API — short for Application Programming Interface — defines a set of rules by which applications and services can interact.

APIs are used in a wide variety of contexts. However, often, when people talk about APIs, they are talking about web APIs. These APIs allow for communication between applications and services using the HTTP protocol.

Often, web APIs are used for web application servers and web browsers to communicate. However, you may also see web APIs used for communication between different web servers, or between applications on the same server. You may even see web APIs at work between different services acting as parts of the same application. One example of an API enabling communication between different services of the same application is Kubernetes. The Kubernetes API is the linchpin of its powerful orchestration system.

      The Four Main Types of APIs

APIs come in four main types. Each type reflects a different access level or, in the case of composite APIs, a different usage pattern.

      Which one of these you use depends on your API’s particular needs. The sections below provide descriptions of each kind of API and they can help you decide which is best for your use case. Each section also provides context and examples to make it easier to see how each API model can fit into different use cases.

      Open APIs

      Open APIs, or public APIs, come with limited or no access restrictions. This essentially allows any developer to make requests to these APIs.

      These APIs may have some limits. A developer may have to register an account to receive an API key, for instance. Additionally, limits may be placed on things like the number of requests in a given time frame.

      But overall, open APIs are distinguished by being intended for widespread external use. They are meant for third-party developers to be able to access and make use of the API as they need.

Examples of open APIs are those provided by NASA. After completing a simple registration for an API key, NASA gives you access to numerous open APIs. NASA’s open APIs include everything from Earth observation images to information about the weather on Mars.

      When to Use an Open API?

      Make your API open when you intend it for public consumption. Open APIs are especially useful when you have information or services you want to make available to the general public.

      These APIs are often used for open source projects and for the dissemination of public knowledge, like NASA and other government agencies.

      Partner APIs

      Partner APIs require authorization of some kind to use. They still allow external access, but are not intended for the general public to have access to. Instead, partner APIs are designed for use by pre-approved individuals, teams, or organizations.

A partner API may allow public access through a paid subscription, or it may limit access to developers with a business relationship. Typically, the developer has an API key, as with open APIs that require registration. But with partner APIs, keys tend to be given out more sparingly and with more access restrictions.

      An example of a partner API is one that allows two companies to work together. Company A may have an application which Company B has agreed to provide services for. Developers at Company A receive API keys which they can use to access Company B’s API. This allows Company A’s application to make use of Company B’s services while keeping access to these services limited.

      When to Use a Partner API?

      Make your API a partner API when it needs to be accessed externally but that access needs to be limited to authorized users. Partner APIs are ideal for business-to-business services or for subscription-based APIs.

      You are likely to see partner APIs in companies that make use of external services for parts of an application’s functionality. Often, this can be a preferred solution compared to developing services in house. It allows companies to integrate features that have been developed by experts elsewhere into their applications. At the same time, it lets the external experts retain control of their services.

      Internal APIs

      Internal APIs, also called private APIs, disallow external access. Instead, these APIs can only be accessed by developers within a company or even within the particular application to which the API belongs.

      These APIs are the most limited. APIs are incredibly useful in defining communication between applications and services, and this even applies when communication is within a single organization.

      A simple example of an internal API use case is a company that has two applications for selling items. One application allows customers to purchase items directly; the other allows sales personnel to process sales. Both applications need access to the inventory. The company could have both applications independently access the inventory database. However, doing so would likely lead to more difficult and inconsistent maintenance.

      So, instead, the company has an internal API for managing inventory. Both the customer-facing and sales-personnel applications can access this API to view and update inventory. Updates to each application can be made independently, as long as each adheres to the rules of the API.

      When to Use an Internal API?

      Make your API internal when you want to restrict access as much as possible. Internal APIs are designed to be private, with only applications and services within your organization having access. An internal API can even be used when different parts of an application need to communicate.

      These APIs are common within enterprise organizations. When applications scale, it helps to define APIs for managing underlying logic. Take the example above, where business logic can be developed and maintained in the customer-facing and sales-personnel applications. This can be done without concern for the impact to the underlying data storage and retrieval tasks, since those are housed in the internal API.

      Composite APIs

      Composite APIs allow for requests to be bundled or chained together, which, in turn, allows developers to receive single responses for request collections.

      These APIs are useful for reducing server load and network traffic when you expect frequent requests to multiple API endpoints. Calls get made less frequently, resulting in reductions to server processing time and the number of requests across the network.

      This makes composite APIs exceptionally effective for microservices. Often, applications built on microservices have to compile information from multiple sources. Having composite APIs that do this makes for more efficient applications.

      To give an example of a composite API in action, think of an online ordering form. When the user completes and submits the form, the application often has to register the user, check and update inventory, and send a confirmation notification. A composite API allows all of these tasks to be handled simultaneously, in a single call.

      When to Use a Composite API?

      Make use of a composite API when your application exposes endpoints that are likely to be called in groups or in quick succession. This is often the case with microservices, where requests and responses frequently need to be combined.

This type of API can be especially useful when your microservice application needs to communicate with users’ web browsers. Here, you want to optimize network traffic to reduce load times and improve user experience. You also want to reduce your server load to make your application scalable for a larger number of users.

      What are the Different API Protocol Types?

      Every API uses a particular protocol. An API’s protocol defines the rules for how it can communicate. These rules make explicit the kinds of requests that can be made, what the API’s responses look like, and what kinds of data the API can send and receive.

      There are three main protocols used by web APIs.

• REST. Short for Representational State Transfer, REST implements stateless APIs with uniform interfaces using HTTP. REST is actually more of a set of architectural principles for APIs than a protocol proper. You can use the Flask Python framework to build your own REST API.

      • SOAP. The Simple Object Access Protocol uses XML for requests and responses and maintains strict definitions for messages. SOAP is highly adaptable, designed to be neutral, and applicable in many contexts, not just for web APIs. It can even be used in conjunction with REST principles.

• RPC. Simpler than both REST and SOAP, the Remote Procedure Call protocol focuses on actions taken on a server. This is in contrast to both REST and SOAP, which tend to focus on server resources. RPC works primarily on running processes. Often, RPC APIs execute scripts on the server.
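The difference in focus often shows up in the endpoints themselves. The following hypothetical URLs illustrate the contrast, with REST addressing a resource and RPC naming a procedure:

GET  https://example.com/api/users/42        (REST: the HTTP method carries the action)
POST https://example.com/api/getUser?id=42   (RPC style: the URL names the procedure)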

      Conclusion

      This guide has walked you through the basics of APIs, explaining the different categories they fit into and the contexts they are used in. The four main types of APIs are open, partner, internal, and composite. The guide also covered the protocols web APIs use to send and receive messages. These API protocols are REST, SOAP, and RPC. You now have a strong foundation for entering into the world of web APIs. It is a wide and fast-moving world.




      How To Use Wget to Download Files and Interact with REST APIs



      Introduction

Wget is a networking command-line tool that lets you download files and interact with REST APIs. It supports the HTTP, HTTPS, FTP, and FTPS internet protocols. Wget can deal with unstable and slow network connections. In the event of a download failure, Wget keeps trying until the entire file has been retrieved. Wget also lets you resume a file download that was interrupted without starting from scratch.

      You can also use Wget to interact with REST APIs without having to install any additional external programs. You can make GET, POST, PUT, and DELETE HTTP requests with single and multiple headers right in the terminal.

      In this tutorial, you will use Wget to download files, interact with REST API endpoints, and create and manage a Droplet in your DigitalOcean account.


To follow along with this tutorial, open a terminal on your local system or a remote server and run the commands there.

      Prerequisites

      To complete this tutorial, you will need:

      • Wget installed. Most Linux distributions have Wget installed by default. To check, type wget in your terminal and press ENTER. If it is not installed, it will display: command not found. You can install it by running the following command: sudo apt-get install wget.

      • A DigitalOcean account. If you do not have one, sign up for a new account.

      • A DigitalOcean Personal Access Token, which you can create via the DigitalOcean control panel. Instructions to do that can be found here: How to Generate a Personal Access Token.

      Downloading Files

      In this section, you will use Wget to customize your download experience. For example, you will learn to download a single file and multiple files, handle file downloads in unstable network conditions, and, in the case of a download interruption, resume a download.

      First, create a directory to save the files that you will download throughout this tutorial:

      • mkdir -p DigitalOcean-Wget-Tutorial/Downloads

      With the command above, you have created a directory named DigitalOcean-Wget-Tutorial, and inside of it, you created a subdirectory named Downloads. This directory and its subdirectory will be where you will store the files you download.

      Navigate to the DigitalOcean-Wget-Tutorial directory:

      • cd DigitalOcean-Wget-Tutorial

      You have successfully created the directory where you will store the files you download.

      Downloading a file

      In order to download a file using Wget, type wget followed by the URL of the file that you wish to download. Wget will download the file in the given URL and save it in the current directory.

      Let’s download a minified version of jQuery using the following command:

      • wget https://code.jquery.com/jquery-3.6.0.min.js

      Don’t worry if you don’t know what jQuery is – you could have downloaded any file available on the internet. All you need to know is that you successfully used Wget to download a file from the internet.

      The output will look similar to this:

      Output

--2021-07-21 16:25:11--  https://code.jquery.com/jquery-3.6.0.min.js
Resolving code.jquery.com (code.jquery.com)... 69.16.175.10, 69.16.175.42, 2001:4de0:ac18::1:a:1a, ...
Connecting to code.jquery.com (code.jquery.com)|69.16.175.10|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 89501 (87K) [application/javascript]
Saving to: ‘jquery-3.6.0.min.js’

jquery-3.6.0.min.js     100%[===================>]  87.40K   114KB/s    in 0.8s

2021-07-21 16:25:13 (114 KB/s) - ‘jquery-3.6.0.min.js’ saved [89501/89501]

      According to the output above, you have successfully downloaded and saved a file named jquery-3.6.0.min.js to your current directory.

You can check the contents of the current directory using the following command:

• ls

The output will look similar to this:

      Output

      Downloads jquery-3.6.0.min.js

      Specifying the filename for the downloaded file

      When downloading a file, Wget defaults to storing it using the name that the file has on the server. You can change that by using the -O option to specify a new name.

      Download the jQuery file you downloaded previously, but this time save it under a different name:

      • wget -O jquery.min.js https://code.jquery.com/jquery-3.6.0.min.js

With the command above, you set the jQuery file to be saved as jquery.min.js instead of jquery-3.6.0.min.js.

      The output will look similar to this:

      Output

--2021-07-21 16:27:01--  https://code.jquery.com/jquery-3.6.0.min.js
Resolving code.jquery.com (code.jquery.com)... 69.16.175.10, 69.16.175.42, 2001:4de0:ac18::1:a:2b, ...
Connecting to code.jquery.com (code.jquery.com)|69.16.175.10|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 89501 (87K) [application/javascript]
Saving to: ‘jquery.min.js’

jquery.min.js           100%[==================================>]  87.40K   194KB/s    in 0.4s

2021-07-21 16:27:03 (194 KB/s) - ‘jquery.min.js’ saved [89501/89501]

      According to the output above, you have successfully downloaded the jQuery file and saved it as jquery.min.js.

You can use the ls command to list the contents of your current directory, and you will see the jquery.min.js file there:

• ls

The output will look similar to this:

      Output

      Downloads jquery-3.6.0.min.js jquery.min.js

      So far, you have used wget to download files to the current directory. Next, you will download to a specific directory.

      Downloading a file to a specific directory

      When downloading a file, Wget stores it in the current directory by default. You can change that by using the -P option to specify the name of the directory where you want to save the file.

      Download the jQuery file you downloaded previously, but this time save it in the Downloads subdirectory.

      • wget -P Downloads/ https://code.jquery.com/jquery-3.6.0.min.js

      The output will look similar to this:

      Output

--2021-07-21 16:28:50--  https://code.jquery.com/jquery-3.6.0.min.js
Resolving code.jquery.com (code.jquery.com)... 69.16.175.42, 69.16.175.10, 2001:4de0:ac18::1:a:2b, ...
Connecting to code.jquery.com (code.jquery.com)|69.16.175.42|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 89501 (87K) [application/javascript]
Saving to: ‘Downloads/jquery-3.6.0.min.js’

jquery-3.6.0.min.js     100%[==================================>]  87.40K  43.6KB/s    in 2.0s

2021-07-21 16:28:53 (43.6 KB/s) - ‘Downloads/jquery-3.6.0.min.js’ saved [89501/89501]

      Notice the last line where it says that the jquery-3.6.0.min.js file was saved in the Downloads directory.

If you use the ls Downloads command to list the contents of the Downloads directory, you will see the jQuery file there:

• ls Downloads

The output will look similar to this:

      Output

      jquery-3.6.0.min.js

      Turning Wget’s output off

      By default, Wget outputs a lot of information to the terminal when you download a file. You can use the -q option to turn off all output.

      Download the jQuery file, but this time without showing any output:

      • wget -q https://code.jquery.com/jquery-3.6.0.min.js

You won’t see any output, but if you use the ls command to list the contents of the current directory, you will find a file named jquery-3.6.0.min.js.1:

• ls

The output will look similar to this:

      Output

      Downloads jquery-3.6.0.min.js jquery-3.6.0.min.js.1 jquery.min.js

      Before saving a file, Wget checks whether the file exists in the desired directory. If it does, Wget adds a number to the end of the file. If you ran the command above one more time, Wget would create a file named jquery-3.6.0.min.js.2. This number increases every time you download a file to a directory that already has a file with the same name.
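
If you would rather have Wget skip the download entirely when a file with the same name already exists, instead of saving numbered copies, you can pass the --no-clobber (-nc) option:

• wget -nc -q https://code.jquery.com/jquery-3.6.0.min.js

If jquery-3.6.0.min.js is already present, the command exits without downloading anything.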

      You have successfully turned off Wget’s output, but now you can’t monitor the download progress. Let’s look at how to show the download progress bar.

      Showing the download progress bar

      Wget lets you show the download progress bar but hide any other output by using the -q option alongside the --show-progress option.

      Download the jQuery file, but this time only show the download progress bar:

      • wget -q --show-progress https://code.jquery.com/jquery-3.6.0.min.js

      The output will look similar to this:

      Output

      jquery-3.6.0.min.js.2 100%[================================================>] 87.40K 207KB/s in 0.4s

Use the ls command to check the contents of the current directory, and you will find the file you have just downloaded with the name jquery-3.6.0.min.js.2.

      From this point forward you will be using the -q and --show-progress options in most of the subsequent Wget commands.

      So far you have only downloaded a single file. Next, you will download multiple files.

      Downloading multiple files

In order to download multiple files using Wget, you need to create a .txt file and insert the URLs of the files you wish to download. After inserting the URLs inside the file, use the wget command with the -i option followed by the name of the .txt file containing the URLs.

Using your preferred text editor, create a file named images.txt:

      In images.txt, add the following URLs:

      images.txt

      https://cdn.pixabay.com/photo/2016/12/13/05/15/puppy-1903313__340.jpg
      https://cdn.pixabay.com/photo/2016/01/05/17/51/maltese-1123016__340.jpg
      https://cdn.pixabay.com/photo/2020/06/30/22/34/dog-5357794__340.jpg
      

      The URLs link to three random images of dogs found on Pixabay. After you have added the URLs, save and close the file.

Now you will use the -i option alongside the -P, -q, and --show-progress options that you learned earlier to download all three images to the Downloads directory:

      • wget -i images.txt -P Downloads/ -q --show-progress

      The output will look similar to this:

      Output

puppy-1903313__340.jp 100%[=========================>]  26.44K  93.0KB/s    in 0.3s
maltese-1123016__340. 100%[=========================>]  50.81K  --.-KB/s    in 0.06s
dog-5357794__340.jpg  100%[=========================>]  30.59K  --.-KB/s    in 0.07s

      If you use the ls Downloads command to list the contents of the Downloads directory, you will find the names of the three images you have just downloaded:

      The output will look similar to this:

      Output

      dog-5357794__340.jpg jquery-3.6.0.min.js maltese-1123016__340.jpg puppy-1903313__340.jpg

      Limiting download speed

So far, you have downloaded files at the maximum available download speed. However, you might want to limit the download speed to preserve resources for other tasks. You can limit the download speed by using the --limit-rate option followed by the maximum speed allowed in kilobytes per second and the letter k.

Download the first image in the images.txt file at a speed of 15 kB/s to the Downloads directory:

      • wget --limit-rate 15k -P Downloads/ -q --show-progress https://cdn.pixabay.com/photo/2016/12/13/05/15/puppy-1903313__340.jpg

      The output will look similar to this:

      Output

      puppy-1903313__340.jpg.1 100%[====================================================>] 26.44K 16.1KB/s in 1.6s

      If you use the ls Downloads command to check the contents of the Downloads directory, you will see the file you have just downloaded with the name puppy-1903313__340.jpg.1.

      When downloading a file that already exists, Wget creates a new file instead of overwriting the existing file. Next, you will overwrite a downloaded file.

      Overwriting a downloaded file

      You can overwrite a file you have downloaded by using the -O option alongside the name of the file. In the code below, you will first download the second image listed in the images.txt file to the current directory and then you will overwrite it.

      First, download the second image to the current directory and set the name to image2.jpg:

      • wget -O image2.jpg -q --show-progress https://cdn.pixabay.com/photo/2016/12/13/05/15/puppy-1903313__340.jpg

The output will look similar to this:

      Output

      image2.jpg 100%[====================================================>] 26.44K --.-KB/s in 0.04s

      If you use the ls command to check the contents of the current directory, you will see the file you have just downloaded with the name image2.jpg.

If you wish to overwrite this image2.jpg file, you can run the same command you ran earlier:

      • wget -O image2.jpg -q --show-progress https://cdn.pixabay.com/photo/2016/12/13/05/15/puppy-1903313__340.jpg

      You can run the command above as many times as you like and Wget will download the file and overwrite the existing one. If you run the command above without the -O option, Wget will create a new file each time you run it.

      Resuming a download

      Thus far, you have successfully downloaded multiple files without interruption. However, if the download was interrupted, you can resume it by using the -c option.

Run the following command to download a random image of a dog found on Pixabay. Note that in the command, you have set the maximum speed to 1 kB/s. Before the image finishes downloading, press CTRL+C to cancel the download:

      • wget --limit-rate 1k -q --show-progress https://cdn.pixabay.com/photo/2018/03/07/19/51/grass-3206938__340.jpg

      To resume the download, pass the -c option. Note that this will only work if you run this command in the same directory as the incomplete file:

      • wget -c --limit-rate 1k -q --show-progress https://cdn.pixabay.com/photo/2018/03/07/19/51/grass-3206938__340.jpg

      Up until now, you have only downloaded files in the foreground. Next, you will download files in the background.

      Downloading in the background

      You can download files in the background by using the -b option.

      Run the command below to download a random image of a dog from Pixabay in the background:

      • wget -b https://cdn.pixabay.com/photo/2018/03/07/19/51/grass-3206938__340.jpg

When you download files in the background, Wget creates a file named wget-log in the current directory and redirects all output to this file. If you wish to watch the status of the download, you can use the following command:

• tail -f wget-log

      The output will look similar to this:

      Output

Resolving cdn.pixabay.com (cdn.pixabay.com)... 104.18.20.183, 104.18.21.183, 2606:4700::6812:14b7, ...
Connecting to cdn.pixabay.com (cdn.pixabay.com)|104.18.20.183|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 33520 (33K) [image/jpeg]
Saving to: ‘grass-3206938__340.jpg’

     0K .......... .......... .......... ..                  100%  338K=0.1s

2021-07-20 23:49:52 (338 KB/s) - ‘grass-3206938__340.jpg’ saved [33520/33520]

      Setting a timeout

      Until this point, we have assumed that the server that you are trying to download files from is working properly. However, let’s assume that the server is not working properly. You can use Wget to first limit the amount of time that you wait for the server to respond and then limit the number of times that Wget tries to reach the server.

      If you wish to download a file but you are unsure if the server is working properly, you can set a timeout by using the -T option followed by the time in seconds.

      In the following command, you are setting the timeout to 5 seconds:

      • wget -T 5 -q --show-progress https://cdn.pixabay.com/photo/2016/12/13/05/15/puppy-1903313__340.jpg

      Setting maximum number of tries

      You can also set how many times Wget attempts to download a file after being interrupted by passing the --tries option followed by the number of tries.

      By running the command below, you are limiting the number of tries to 3:

      • wget --tries=3 -q --show-progress https://cdn.pixabay.com/photo/2018/03/07/19/51/grass-3206938__340.jpg

      If you would like to try indefinitely you can pass inf alongside the --tries option:

      • wget --tries=inf -q --show-progress https://cdn.pixabay.com/photo/2018/03/07/19/51/grass-3206938__340.jpg
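
The -T and --tries options can also be combined to bound both the wait for each attempt and the number of attempts. For example, the following command limits each attempt to a five-second timeout and stops after three attempts:

• wget -T 5 --tries=3 -q --show-progress https://cdn.pixabay.com/photo/2018/03/07/19/51/grass-3206938__340.jpg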

      In this section, you used Wget to download a single file and multiple files, resume downloads, and handle network issues. In the next section, you will learn to interact with REST API endpoints.

      Interacting with REST APIs

      In this section, you will use Wget to interact with REST APIs without having to install an external program. You will learn the syntax to send the most commonly used HTTP methods: GET, POST, PUT, and DELETE.

      We are going to use JSONPlaceholder as the mock REST API. JSONPlaceholder is a free online REST API that you can use for fake data. (The requests you send to it won’t affect any databases and the data won’t be saved.)

      Sending GET requests

Wget lets you send GET requests by running a command that looks like the following:

• wget -O- [ URL ]

In the command above, the - after the -O option means standard output, so Wget will send the output of the URL to the terminal instead of sending it to a file as you did in the previous section. GET is the default HTTP method that Wget uses.

      Run the following command in the terminal window:

      • wget -O- https://jsonplaceholder.typicode.com/posts?_limit=2

      In the command above, you used wget to send a GET request to JSON Placeholder in order to retrieve two posts from the REST API.

      The output will look similar to this:

      Output

--2021-07-21 16:52:51--  https://jsonplaceholder.typicode.com/posts?_limit=2
Resolving jsonplaceholder.typicode.com (jsonplaceholder.typicode.com)... 104.21.10.8, 172.67.189.217, 2606:4700:3032::6815:a08, ...
Connecting to jsonplaceholder.typicode.com (jsonplaceholder.typicode.com)|104.21.10.8|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 600 [application/json]
Saving to: ‘STDOUT’

[
  {
    "userId": 1,
    "id": 1,
    "title": "sunt aut facere repellat provident occaecati excepturi optio reprehenderit",
    "body": "quia et suscipit\nsuscipit recusandae consequuntur expedita et cum\nreprehenderit molestiae ut ut quas totam\nnostrum rerum est autem sunt rem eveniet architecto"
  },
  {
    "userId": 1,
    "id": 2,
    "title": "qui est esse",
    "body": "est rerum tempore vitae\nsequi sint nihil reprehenderit dolor beatae ea dolores neque\nfugiat blanditiis voluptate porro vel nihil molestiae ut reiciendis\nqui aperiam non debitis possimus qui neque nisi nulla"
  }
]

2021-07-21 16:52:53 (4.12 MB/s) - written to stdout [600/600]

      Notice the line where it says HTTP request sent, awaiting response... 200 OK, which means that you have successfully sent a GET request to JSONPlaceholder.

      If that is too much output you can use the -q option that you learned in the previous section to restrict the output to the results of the GET request:

      • wget -O- -q https://jsonplaceholder.typicode.com/posts?_limit=2

      The output will look similar to this:

      Output

      [ { "userId": 1, "id": 1, "title": "sunt aut facere repellat provident occaecati excepturi optio reprehenderit", "body": "quia et suscipitnsuscipit recusandae consequuntur expedita et cumnreprehenderit molestiae ut ut quas totamnnostrum rerum est autem sunt rem eveniet architecto" }, { "userId": 1, "id": 2, "title": "qui est esse", "body": "est rerum tempore vitaensequi sint nihil reprehenderit dolor beatae ea dolores nequenfugiat blanditiis voluptate porro vel nihil molestiae ut reiciendisnqui aperiam non debitis possimus qui neque nisi nulla" } ]

      Sending POST requests

      Wget lets you send POST requests by running a command that looks like the following:

• wget --method=post -O- --body-data=[ body in JSON format ] --header=[ string ] [ URL ]

      Run the following command:

• wget --method=post -O- -q --body-data='{"title": "Wget POST","body": "Wget POST example body","userId":1}' --header=Content-Type:application/json https://jsonplaceholder.typicode.com/posts

In the command above, you used wget to send a POST request to JSON Placeholder to create a new post. You set the method to post, the header to Content-Type:application/json, and sent the following request body: {"title": "Wget POST","body": "Wget POST example body","userId":1}.

      The output will look similar to this:

      Output

      { "title": "Wget POST", "body": "Wget POST example body", "userId": 1, "id": 101 }

      Sending PUT requests

      Wget lets you send PUT requests by running a command that looks like the following:

• wget --method=put -O- --body-data=[ body in JSON format ] --header=[ string ] [ URL ]

      Run the following command:

• wget --method=put -O- -q --body-data='{"title": "Wget PUT", "body": "Wget PUT example body", "userId": 1, "id":1}' --header=Content-Type:application/json https://jsonplaceholder.typicode.com/posts/1

In the command above, you used wget to send a PUT request to JSON Placeholder to edit the first post in this REST API. You set the method to put, the header to Content-Type:application/json, and sent the following request body: {"title": "Wget PUT", "body": "Wget PUT example body", "userId": 1, "id":1}.

      The output will look similar to this:

      Output

      { "body": "Wget PUT example body", "title": "Wget PUT", "userId": 1, "id": 1 }

      Sending DELETE requests

      Wget lets you send DELETE requests by running a command that looks like the following:

• wget --method=delete -O- [ URL ]

      Run the following command:

      • wget --method=delete -O- -q --header=Content-Type:application/json https://jsonplaceholder.typicode.com/posts/1

In the command above, you used wget to send a DELETE request to JSON Placeholder to delete the first post in this REST API. You set the method to delete and identified the post to delete by appending its id, 1, to the URL.

      The output will look similar to this:

      Output

      {}

      In this section, you learned how to use Wget to send GET, POST, PUT and DELETE requests with only one header field. In the next section, you will learn how to send multiple header fields in order to create and manage a Droplet in your DigitalOcean account.

      Creating and Managing a DigitalOcean Droplet

In this section, you will apply what you learned in the previous section and use Wget to create and manage a Droplet in your DigitalOcean account. But before you do that, you will learn how to send multiple header fields in an HTTP method.

      The syntax for a command to send multiple headers looks like this:

      • wget --header=[ first header ] --header=[ second header] --header=[ N header] [ URL ]

You can include as many header fields as you like by repeating the --header option as many times as you need.

      To create a Droplet or interact with any other resource in the DigitalOcean API, you will need to send two request headers:

      Content-Type: application/json
      Authorization: Bearer your_personal_access_token
      

You already saw the first header in the previous section. The second header is what lets you authenticate your account. It consists of the string Bearer followed by your DigitalOcean account’s Personal Access Token.

      Run the following command, replacing your_personal_access_token with your DigitalOcean Personal Access Token:

• wget --method=post -O- -q --header="Content-Type: application/json" --header="Authorization: Bearer your_personal_access_token" --body-data='{"name":"Wget-example","region":"nyc1","size":"s-1vcpu-1gb","image":"ubuntu-20-04-x64","tags": ["Wget-tutorial"]}' https://api.digitalocean.com/v2/droplets

With the command above, you have created an ubuntu-20-04-x64 Droplet in the nyc1 region named Wget-example, with 1 vCPU and 1 GB of memory, and you have set the tag to Wget-tutorial. For more information about the attributes in the body-data field, see the DigitalOcean API documentation.

      The output will look similar to this:

      Output

      {"droplet":{"id":237171073,"name":"Wget-example","memory":1024,"vcpus":1,"disk":25,"locked":false,"status":"new","kernel":null,"created_at":"2021-03-16T12:38:59Z","features":[],"backup_ids":[],"next_backup_window":null,"snapshot_ids":[],"image":{"id":72067660,"name":"20.04 (LTS) x64","distribution":"Ubuntu","slug":"ubuntu-20-04-x64","public":true,"regions":["nyc3","nyc1","sfo1","nyc2","ams2","sgp1","lon1","ams3","fra1","tor1","sfo2","blr1","sfo3"],"created_at":"2020-10-20T16:34:30Z","min_disk_size":15,"type":"base","size_gigabytes":0.52,"description":"Ubuntu 20.04 x86","tags":[],"status":"available"},"volume_ids":[],"size":{"slug":"s-1vcpu-1gb","memory":1024,"vcpus":1,"disk":25,"transfer":1.0,"price_monthly":5.0,"price_hourly":0.00744,"regions":["ams2","ams3","blr1","fra1","lon1","nyc1","nyc2","nyc3","sfo1","sfo3","sgp1","tor1"],"available":true,"description":"Basic"},"size_slug":"s-1vcpu-1gb","networks":{"v4":[],"v6":[]},"region":{"name":"New York 1","slug":"nyc1","features":["backups","ipv6","metadata","install_agent","storage","image_transfer"],"available":true,"sizes":["s-1vcpu-1gb","s-1vcpu-1gb-intel","s-1vcpu-2gb","s-1vcpu-2gb-intel","s-2vcpu-2gb","s-2vcpu-2gb-intel","s-2vcpu-4gb","s-2vcpu-4gb-intel","s-4vcpu-8gb","c-2","c2-2vcpu-4gb","s-4vcpu-8gb-intel","g-2vcpu-8gb","gd-2vcpu-8gb","s-8vcpu-16gb","m-2vcpu-16gb","c-4","c2-4vcpu-8gb","s-8vcpu-16gb-intel","m3-2vcpu-16gb","g-4vcpu-16gb","so-2vcpu-16gb","m6-2vcpu-16gb","gd-4vcpu-16gb","so1_5-2vcpu-16gb","m-4vcpu-32gb","c-8","c2-8vcpu-16gb","m3-4vcpu-32gb","g-8vcpu-32gb","so-4vcpu-32gb","m6-4vcpu-32gb","gd-8vcpu-32gb","so1_5-4vcpu-32gb","m-8vcpu-64gb","c-16","c2-16vcpu-32gb","m3-8vcpu-64gb","g-16vcpu-64gb","so-8vcpu-64gb","m6-8vcpu-64gb","gd-16vcpu-64gb","so1_5-8vcpu-64gb","m-16vcpu-128gb","c-32","c2-32vcpu-64gb","m3-16vcpu-128gb","m-24vcpu-192gb","g-32vcpu-128gb","so-16vcpu-128gb","m6-16vcpu-128gb","gd-32vcpu-128gb","m3-24vcpu-192gb","g-40vcpu-160gb","so1_5-16vcpu-128gb","m-32vcpu-256gb","gd-40vcpu-160gb","so-24vcpu-192gb","m6-24vcpu-192gb","m3-32vcpu-256gb","so1_5-24vcpu-192gb"]},"tags":["Wget-tutorial"]},"links":{"actions":[{"id":1164336542,"rel":"create","href":"https://api.digitalocean.com/v2/actions/1164336542"}]}}

If you see an output similar to the one above, that means you have successfully created a Droplet.

      Now let’s get a list of all the Droplets in your account that have the tag Wget-tutorial. Run the following command, replacing your_personal_access_token with your DigitalOcean Personal Access Token:

      • wget -O- -q --header="Content-Type: application/json" --header="Authorization: Bearer your_personal_access_token" https://api.digitalocean.com/v2/droplets?tag_name=Wget-tutorial

      You should see the name of the Droplet you have just created in the output:

      Output

      {"droplets":[{"id":237171073,"name":"Wget-example","memory":1024,"vcpus":1,"disk":25,"locked":false,"status":"active","kernel":null,"created_at":"2021-03-16T12:38:59Z","features":["private_networking"],"backup_ids":[],"next_backup_window":null,"snapshot_ids":[],"image":{"id":72067660,"name":"20.04 (LTS) x64","distribution":"Ubuntu","slug":"ubuntu-20-04-x64","public":true,"regions":["nyc3","nyc1","sfo1","nyc2","ams2","sgp1","lon1","ams3","fra1","tor1","sfo2","blr1","sfo3"],"created_at":"2020-10-20T16:34:30Z","min_disk_size":15,"type":"base","size_gigabytes":0.52,"description":"Ubuntu 20.04 x86","tags":[],"status":"available"},"volume_ids":[],"size":{"slug":"s-1vcpu-1gb","memory":1024,"vcpus":1,"disk":25,"transfer":1.0,"price_monthly":5.0,"price_hourly":0.00744,"regions":["ams2","ams3","blr1","fra1","lon1","nyc1","nyc2","nyc3","sfo1","sfo3","sgp1","tor1"],"available":true,"description":"Basic"},"size_slug":"s-1vcpu-1gb","networks":{"v4":[{"ip_address":"10.116.0.2","netmask":"255.255.240.0","gateway":"","type":"private"},{"ip_address":"204.48.20.197","netmask":"255.255.240.0","gateway":"204.48.16.1","type":"public"}],"v6":[]},"region":{"name":"New York 1","slug":"nyc1","features":["backups","ipv6","metadata","install_agent","storage","image_transfer"],"available":true,"sizes":["s-1vcpu-1gb","s-1vcpu-1gb-intel","s-1vcpu-2gb","s-1vcpu-2gb-intel","s-2vcpu-2gb","s-2vcpu-2gb-intel","s-2vcpu-4gb","s-2vcpu-4gb-intel","s-4vcpu-8gb","c-2","c2-2vcpu-4gb","s-4vcpu-8gb-intel","g-2vcpu-8gb","gd-2vcpu-8gb","s-8vcpu-16gb","m-2vcpu-16gb","c-4","c2-4vcpu-8gb","s-8vcpu-16gb-intel","m3-2vcpu-16gb","g-4vcpu-16gb","so-2vcpu-16gb","m6-2vcpu-16gb","gd-4vcpu-16gb","so1_5-2vcpu-16gb","m-4vcpu-32gb","c-8","c2-8vcpu-16gb","m3-4vcpu-32gb","g-8vcpu-32gb","so-4vcpu-32gb","m6-4vcpu-32gb","gd-8vcpu-32gb","so1_5-4vcpu-32gb","m-8vcpu-64gb","c-16","c2-16vcpu-32gb","m3-8vcpu-64gb","g-16vcpu-64gb","so-8vcpu-64gb","m6-8vcpu-64gb","gd-16vcpu-64gb","so1_5-8vcpu-64gb","m-16vcpu-128gb","c-32","c2-32vcpu-64gb","m3-16vcpu-128gb","m-24vcpu-192gb","g-32vcpu-128gb","so-16vcpu-128gb","m6-16vcpu-128gb","gd-32vcpu-128gb","m3-24vcpu-192gb","g-40vcpu-160gb","so1_5-16vcpu-128gb","m-32vcpu-256gb","gd-40vcpu-160gb","so-24vcpu-192gb","m6-24vcpu-192gb","m3-32vcpu-256gb","so1_5-24vcpu-192gb"]},"tags":["Wget-tutorial"],"vpc_uuid":"5ee0a168-39d1-4c60-a89c-0b47390f3f7e"}],"links":{},"meta":{"total":1}}

      Now let’s take the id of the Droplet you have created and use it to delete the Droplet. Run the following command, replacing your_personal_access_token with your DigitalOcean Personal Access Token and your_droplet_id with your Droplet id:

      • wget --method=delete -O- --header="Content-Type: application/json" --header="Authorization: Bearer your_personal_access_token" https://api.digitalocean.com/v2/droplets/your_droplet_id

In the command above, you added your Droplet id to the URL to delete it. If you see 204 No Content in the output, you have succeeded in deleting the Droplet.

      In this section, you used Wget to send multiple headers. Then, you created and managed a Droplet in your DigitalOcean account.

      Conclusion

      In this tutorial, you used Wget to download files in stable and unstable network conditions and interact with REST API endpoints. You then used this knowledge to create and manage a Droplet in your DigitalOcean account. If you would like to learn more about Wget, visit this tool’s manual page. For more Linux command-line tutorials visit DigitalOcean community tutorials.


