
Getting Started with Axios in Node.js: A Beginner’s Guide

Alex Johnson

Senior Web Scraping Engineer

07-Nov-2024

As we all know, handling HTTP requests efficiently is crucial in Node.js, and one tool that makes this easy is Axios. Whether you're retrieving data from APIs, sending data to servers, or simply scraping content from websites, Axios is a great tool to streamline the process.

Let’s dive into what Axios is, how it works, and explore some practical ways to use it effectively in Node.js.

What is Axios in Node.js?

Axios is a promise-based HTTP client designed for both client-side and server-side JavaScript environments. With Axios, you can make HTTP requests from the browser in frontend applications and from the server in Node.js applications. Axios automatically transforms JSON data and handles many of the complexities of request-response handling, making it simple to work with APIs and handle data.

Axios was originally built as a browser-based HTTP client but has become incredibly popular on the server side, especially with Node.js. With its simple syntax and extensive customization options, Axios has become one of the go-to libraries for developers around the world.

Key Features of Axios:

  1. Promise-Based: Since Axios is built on Promises, it allows for cleaner, more manageable code using async/await syntax. This approach makes it easy to write asynchronous code without falling into callback hell, keeping your code readable and maintainable.

  2. Automatic Data Transformation: Axios automatically transforms JSON data, simplifying the process of sending and receiving data in JSON format. When it sends a request or receives a response in JSON format, it parses and serializes the data automatically, allowing you to work directly with JavaScript objects.

  3. Client and Server-Side Compatibility: Axios works both on the client side (in the browser) and server side (in Node.js), making it versatile for both frontend and backend development. When used in Node.js, Axios simplifies making HTTP requests to APIs, performing server-to-server data fetching, and more.

  4. Built-in Interceptors: Axios supports request and response interceptors, which are useful for handling tasks like logging, error handling, authentication, or adding headers before requests are sent or responses are processed.

  5. Customizable Configuration: Axios allows for extensive customization. For example, you can set global defaults such as headers or base URLs, making it easy to configure for different environments.

  6. Error Handling: Axios provides a robust error-handling mechanism that lets you catch errors based on the status code returned. You can handle client and server errors differently, making it easier to manage response handling.

Common Use Cases for Axios in Node.js

  • Fetching Data from APIs: Axios is commonly used in Node.js to fetch data from APIs, making it ideal for building RESTful applications, especially when you need to interact with third-party services or microservices.

  • Web Scraping: Axios can be used to fetch HTML content from websites when web scraping, especially for static sites. Combined with tools like Cheerio, it enables parsing HTML to extract relevant data.

  • Handling Authentication: Axios is frequently used in Node.js to interact with authentication services, such as sending credentials to an API for authentication and handling tokens.

  • File Uploads/Downloads: It’s capable of handling binary data as well, so you can use it to upload or download files from a server.

Having trouble with web scraping challenges and constant blocks on your projects?
Try Scrapeless to make data extraction easy and efficient, all in one powerful tool. Try it free today!

Example Use in Node.js

Below is an example of using Axios in Node.js to make a GET request:

```javascript
const axios = require('axios');

async function fetchData() {
  try {
    const response = await axios.get('https://jsonplaceholder.typicode.com/posts');
    console.log(response.data);
  } catch (error) {
    console.error('Error fetching data:', error.message);
  }
}

fetchData();
```

This code demonstrates the simplicity of making requests with Axios in Node.js. The response.data property contains the actual data from the API, while the catch block logs error.message if anything goes wrong, offering a streamlined approach to request handling.

Is Axios Server-Side or Client-Side?

Axios is a versatile HTTP client that works on both server and client environments. Here’s how it fits into each:

  • Server-Side (Node.js): In Node.js, Axios can make HTTP requests from the backend server. This feature is commonly used for server-to-server communication, data retrieval from third-party APIs, and scraping websites.
  • Client-Side (Browser): In frontend applications, Axios handles HTTP requests, such as retrieving and posting data to/from APIs.

By using Axios, you get a unified API to handle both client and server HTTP requests, making code more reusable and reducing the need to learn different request methods for frontend and backend.

Is Axios a Node Module?

Yes, Axios is a Node module. It’s a standalone library, meaning you can install it easily via npm or yarn in any Node.js project. It’s not bundled with Node.js by default but can be integrated into any project by installing it as a dependency.

To install Axios, you can use the following command in your terminal:

```bash
npm install axios
```

After installing, you can import Axios at the top of your script with:

```javascript
const axios = require('axios');
```

This module is lightweight, fast, and designed to handle HTTP requests without adding much complexity to your project.

How to Install Axios in Terminal?

To install Axios in your Node.js environment, open your terminal and navigate to your project’s root directory. Run the following command:

```bash
npm install axios
```

Or if you prefer Yarn, you can use:

```bash
yarn add axios
```

Once installed, you can immediately use it in your project by requiring or importing it, as shown in the previous section. This installation will add Axios as a dependency to your project, allowing you to use its methods to handle HTTP requests efficiently.

How to Use Axios in Web Scraping

To use Axios effectively in web scraping, follow these steps, which cover everything from setting up Axios to parsing the data and handling common challenges:

1. Setting Up Axios for Web Scraping

First, ensure you have Node.js and Axios installed in your project. You can install Axios by running:

```bash
npm install axios
```

With Axios installed, it’s ready to use for making HTTP requests to websites.

2. Basic Usage for Fetching Web Page Content

When web scraping, the main goal is to fetch the HTML content of a webpage. Here’s how you can make a simple GET request to a website using Axios:

```javascript
const axios = require('axios');

async function fetchHTML(url) {
  try {
    const response = await axios.get(url);
    return response.data; // HTML content of the page
  } catch (error) {
    console.error(`Error fetching HTML: ${error.message}`);
  }
}

fetchHTML('https://example.com').then(html => console.log(html));
```

In this code, fetchHTML sends a GET request to the provided URL and returns the HTML content. This is the foundation for scraping data from static websites.

3. Parsing HTML Content

To extract specific information from the HTML, use a parsing library like Cheerio. Cheerio allows you to query HTML using jQuery-like syntax, making it easy to target elements. Install it by running:

```bash
npm install cheerio
```

Here’s how you might use Cheerio alongside Axios to scrape data from a webpage:

```javascript
const axios = require('axios');
const cheerio = require('cheerio');

async function scrapeWebsite(url) {
  try {
    const { data } = await axios.get(url);
    const $ = cheerio.load(data);

    // Example: Extract all article titles
    const titles = [];
    $('h2.article-title').each((i, element) => {
      titles.push($(element).text());
    });

    console.log('Scraped Titles:', titles);
  } catch (error) {
    console.error(`Error scraping website: ${error.message}`);
  }
}

scrapeWebsite('https://example.com/articles');
```

In this example, Axios retrieves the HTML, and Cheerio parses it to extract article titles (h2.article-title in this case). You can adjust the selectors to target different elements on the page.

4. Handling Headers and User Agents

Some websites block requests from non-browser clients. To mimic a real browser, include headers like User-Agent in your Axios request. Here’s how to set up headers:

```javascript
const axios = require('axios');

async function fetchWithHeaders(url) {
  try {
    const response = await axios.get(url, {
      headers: {
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/89.0.4389.82 Safari/537.36'
      }
    });
    return response.data;
  } catch (error) {
    console.error(`Error with headers: ${error.message}`);
  }
}

fetchWithHeaders('https://example.com').then(html => console.log(html));
```

Adding headers can help prevent the site from blocking your request. The User-Agent header makes your request appear to come from a real browser.

5. Dealing with JavaScript-Rendered Pages

Axios is ideal for static sites, but it can’t execute JavaScript. If you’re scraping JavaScript-heavy sites, use Puppeteer or Playwright to render the page fully. For Axios-only solutions, focus on static content or look for alternative data sources like APIs.

6. Error Handling and Rate Limiting

Error handling is essential in web scraping to manage issues like rate limits or blocked requests. Here’s an example with a basic retry mechanism:

```javascript
const axios = require('axios');

async function fetchWithRetries(url, retries = 3) {
  for (let i = 0; i < retries; i++) {
    try {
      const response = await axios.get(url);
      return response.data;
    } catch (error) {
      console.error(`Attempt ${i + 1} failed: ${error.message}`);
      if (i === retries - 1) throw error;
    }
  }
}

fetchWithRetries('https://example.com').then(html => console.log(html));
```

7. Storing or Processing the Data

Once you have the data, save it to a file or database for further analysis. For example, you can use fs to save data to a JSON file:

```javascript
const fs = require('fs');
const axios = require('axios');
const cheerio = require('cheerio');

async function scrapeAndSave(url) {
  try {
    const { data } = await axios.get(url);
    const $ = cheerio.load(data);
    const results = [];

    // Scrape specific data
    $('h2.article-title').each((i, element) => {
      results.push($(element).text());
    });

    fs.writeFileSync('data.json', JSON.stringify(results, null, 2));
    console.log('Data saved to data.json');
  } catch (error) {
    console.error(`Error saving data: ${error.message}`);
  }
}

scrapeAndSave('https://example.com/articles');
```

In this example, the scraped data is saved to data.json, which you can later load for analysis.

8. Handling Common Challenges

  • Bot Detection: Adding headers and handling cookies can make requests appear more legitimate.
  • Rate Limiting: Use delays between requests or implement a retry mechanism to avoid overwhelming the server.
  • Blocked IPs: Rotate IPs or use proxy services to avoid getting blocked when scraping sites with aggressive anti-scraping measures.

Conclusion

In this guide, we’ve covered the essentials of using Axios in Node.js for web scraping. You learned about Axios’s role as a Node.js module, how to install and set it up, and explored practical techniques like adding headers, using Cheerio for parsing HTML, and handling common web scraping challenges like bot detection and error management.

For developers working with API integrations, web scraping, or data collection, Axios offers a powerful, flexible, and efficient approach to making HTTP requests and handling responses. While ideal for static content, pairing Axios with tools like Puppeteer or Playwright can help overcome limitations with dynamic content.

With these strategies, you’re equipped to use Axios effectively, making it a versatile addition to any developer’s toolkit. Experiment with different configurations to optimize your API calls and web scraping projects for reliability and efficiency.

At Scrapeless, we only access publicly available data while strictly complying with applicable laws, regulations, and website privacy policies. The content in this blog is for demonstration purposes only and does not involve any illegal or infringing activities. We make no guarantees and disclaim all liability for the use of information from this blog or third-party links. Before engaging in any scraping activities, consult your legal advisor and review the target website's terms of service or obtain the necessary permissions.
