Medium Followers Scraper with JavaScript and MySQL
In the digital age, data is a powerful asset. For content creators and marketers, understanding audience engagement on platforms like Medium can provide valuable insights. One way to gather such data is by scraping follower information. This article explores how to build a Medium followers scraper using JavaScript and MySQL, providing a step-by-step guide to help you get started.
Understanding Web Scraping
Web scraping is the process of extracting data from websites. It involves fetching the HTML of a webpage and parsing it to extract the desired information. While web scraping can be incredibly useful, it’s important to adhere to legal and ethical guidelines, such as respecting a website’s terms of service and robots.txt file.
In the context of Medium, scraping follower data can help you analyze your audience, track growth, and tailor your content strategy. However, always ensure you have permission to scrape data and use it responsibly.
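To act on the robots.txt point in practice, you can check a path against the site's rules before fetching it. The sketch below is a deliberately simplified checker that only handles `User-agent: *` groups and prefix-style `Disallow` rules, with no `Allow` or wildcard support (`isDisallowed` is a helper name introduced here, not part of any library; production code should use a dedicated parser such as the `robots-parser` package):

```javascript
// Minimal robots.txt check: returns true if the given path is disallowed
// for all user agents ("User-agent: *"). Simplified sketch only — real
// parsers also handle Allow rules, wildcards, and per-agent groups.
function isDisallowed(robotsTxt, path) {
  let applies = false;
  for (const rawLine of robotsTxt.split('\n')) {
    const line = rawLine.trim();
    const [field, ...rest] = line.split(':');
    const value = rest.join(':').trim();
    if (/^user-agent$/i.test(field)) {
      applies = value === '*';
    } else if (applies && /^disallow$/i.test(field) && value && path.startsWith(value)) {
      return true;
    }
  }
  return false;
}
```

You would fetch `https://medium.com/robots.txt` once (for example with Axios), then run each target path through a check like this before scraping it.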
Setting Up the Environment
Before diving into the code, you’ll need to set up your development environment. This involves installing Node.js, a JavaScript runtime, and MySQL, a popular relational database management system. Node.js will allow you to run JavaScript on the server side, while MySQL will store the scraped data.
To install Node.js, visit the official website and download the installer for your operating system. For MySQL, you can use a package manager like Homebrew on macOS or download the installer from the MySQL website. Once installed, ensure both Node.js and MySQL are added to your system’s PATH.
Building the Scraper with JavaScript
With the environment set up, you can start building the scraper. We’ll use the Axios library to make HTTP requests and Cheerio to parse the HTML. First, create a new Node.js project and install the necessary packages:
```
npm init -y
npm install axios cheerio
```
Next, create a JavaScript file, e.g., `scraper.js`, and import the required modules:
```javascript
const axios = require('axios');
const cheerio = require('cheerio');
```
Now, write a function to fetch the Medium profile page and extract follower data:
```javascript
async function scrapeFollowers(username) {
  try {
    const response = await axios.get(`https://medium.com/@${username}`);
    const $ = cheerio.load(response.data);
    // The follower count appears in the page's meta description,
    // e.g. "... 1234 Followers ...". Guard against a missing match.
    const description = $('meta[name="description"]').attr('content') || '';
    const match = description.match(/(\d+) Followers/);
    if (!match) {
      console.error(`Could not find follower count for ${username}.`);
      return;
    }
    console.log(`User ${username} has ${match[1]} followers.`);
  } catch (error) {
    console.error('Error fetching data:', error);
  }
}

scrapeFollowers('exampleUser');
```
This function fetches the Medium profile page for a given username, loads the HTML into Cheerio, extracts the follower count from the page's meta description with a regular expression, and logs it to the console. Note that if Medium changes its page structure or meta description format, the regular expression will need to be updated.
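If you scrape several profiles, firing all the requests at once risks being rate-limited or blocked. A polite approach is to run them sequentially with a pause in between. Here is a minimal sketch (`sleep` and `scrapeMany` are helper names introduced here, not part of any library):

```javascript
// Resolve after ms milliseconds; used to space out requests.
function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

// Run a scraping function over several usernames one at a time,
// waiting delayMs between requests instead of firing them all at once.
async function scrapeMany(usernames, scrapeFn, delayMs = 2000) {
  for (const username of usernames) {
    await scrapeFn(username);
    await sleep(delayMs);
  }
}

// Example: scrapeMany(['userA', 'userB'], scrapeFollowers);
```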
Storing Data in MySQL
To store the scraped data, you’ll need to set up a MySQL database. Start by creating a new database and table to hold the follower information:
```sql
CREATE DATABASE medium_scraper;
USE medium_scraper;

CREATE TABLE followers (
  id INT AUTO_INCREMENT PRIMARY KEY,
  username VARCHAR(255) NOT NULL,
  follower_count INT NOT NULL,
  scraped_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
```
Next, install the MySQL package for Node.js to interact with the database:
```
npm install mysql
```
Modify your `scraper.js` file to insert the scraped data into the database:
```javascript
const axios = require('axios');
const cheerio = require('cheerio');
const mysql = require('mysql');

const connection = mysql.createConnection({
  host: 'localhost',
  user: 'root',
  password: 'yourpassword',
  database: 'medium_scraper'
});

connection.connect();

async function scrapeFollowers(username) {
  try {
    const response = await axios.get(`https://medium.com/@${username}`);
    const $ = cheerio.load(response.data);
    const description = $('meta[name="description"]').attr('content') || '';
    const match = description.match(/(\d+) Followers/);
    if (!match) {
      console.error(`Could not find follower count for ${username}.`);
      return;
    }
    const followers = parseInt(match[1], 10);
    // Parameterized query guards against SQL injection.
    const query = 'INSERT INTO followers (username, follower_count) VALUES (?, ?)';
    connection.query(query, [username, followers], (error) => {
      if (error) throw error;
      console.log(`Inserted data for ${username}: ${followers} followers.`);
    });
  } catch (error) {
    console.error('Error fetching data:', error);
  }
}

scrapeFollowers('exampleUser');
```
This code connects to the MySQL database and inserts the scraped follower data into the `followers` table.
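Once snapshots accumulate in the `followers` table, you can query a user's history and compute growth between runs. Below is a minimal sketch of the analysis side, assuming rows have already been fetched (for example via `SELECT follower_count FROM followers WHERE username = ? ORDER BY scraped_at`) into a plain array; `followerGrowth` is a hypothetical helper name introduced here:

```javascript
// Given snapshots ordered oldest-to-newest, return the net change
// in follower count between the first and last snapshot.
function followerGrowth(rows) {
  if (rows.length < 2) return 0;
  return rows[rows.length - 1].follower_count - rows[0].follower_count;
}

// followerGrowth([{ follower_count: 120 }, { follower_count: 150 }]) → 30
```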
Conclusion
Building a Medium followers scraper with JavaScript and MySQL is a practical way to gather insights about your audience. By following this guide, you can set up a basic scraper that fetches follower data and stores it in a database for further analysis. Remember to use web scraping responsibly and respect the terms of service of the websites you interact with. With this foundation, you can expand your scraper to gather more data and refine your content strategy based on real audience insights.