
SEOtools 🛠️

A set of utilities that help SEOs and web developers complete common tasks.

With SEOtools, you can:

  1. Programmatically add links to related posts in content.
  2. Calculate PageRank on internal links from your sitemap.
  3. Identify broken links on a web page.
  4. Recommend a post to use as canonical for a given keyword.
  5. Find the distance of pages from your home page.

And more!

Installation 💻

You can install SEOtools using pip:

pip install seotools

Quickstart 🚀

from seotools.app import Analyzer

analyzer = Analyzer("https://jamesg.blog/sitemap.xml")

# build the internal link graph, compute PageRank, and embed page headings
analyzer.create_link_graph(10, 20)
analyzer.compute_pagerank()
analyzer.embed_headings()

Get the PageRank of a URL

print(analyzer.pagerank["https://jamesg.blog"])
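Since the PageRank scores are exposed as a plain mapping of URL to score, you can rank pages directly with standard Python. A minimal sketch, using hypothetical scores in place of `analyzer.pagerank`:

```python
# hypothetical scores standing in for analyzer.pagerank
pagerank = {
    "https://jamesg.blog": 0.42,
    "https://jamesg.blog/2023/01/01/": 0.31,
    "https://jamesg.blog/2023/02/14/": 0.27,
}

# sort pages from highest to lowest PageRank
top_pages = sorted(pagerank.items(), key=lambda kv: kv[1], reverse=True)

for url, score in top_pages:
    print(f"{score:.2f}  {url}")
```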
Add links to related posts in content

import markdown
import requests
from bs4 import BeautifulSoup

# maximum number of keyword links to insert
MAX_KEYWORD_REPLACE = 10

article = requests.get("https://jamesg.blog/...")

article = markdown.markdown(BeautifulSoup(article.text, "html.parser").get_text())

keyword_replace_count = 0

# keyword_map (defined by you) maps each keyword to the URL it should link to
for keyword, url in keyword_map.items():
    if keyword_replace_count >= MAX_KEYWORD_REPLACE:
        break

    # link only the first occurrence of each keyword
    article = article.replace(keyword, f"<a href='{url}'>{keyword}</a>", 1)
    keyword_replace_count += 1

print(article)
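The replacement loop can be exercised on a small sample; `keyword_map`, the URL, and the text below are all illustrative:

```python
# illustrative inputs for the keyword-linking loop
keyword_map = {"PageRank": "https://jamesg.blog/pagerank/"}  # hypothetical URL
MAX_KEYWORD_REPLACE = 5

article = "PageRank measures the link equity of a page. PageRank is iterative."

keyword_replace_count = 0

for keyword, url in keyword_map.items():
    if keyword_replace_count >= MAX_KEYWORD_REPLACE:
        break

    # only the first occurrence is linked; later occurrences stay plain text
    article = article.replace(keyword, f"<a href='{url}'>{keyword}</a>", 1)
    keyword_replace_count += 1

print(article)
```

Only the first "PageRank" becomes a link, since `str.replace` is called with a count of 1.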
Recommend related content

article = requests.get("https://jamesg.blog/...")

article = markdown.markdown(BeautifulSoup(article.text, "html.parser").get_text())

urls = analyzer.recommend_related_content(article)

Check if a page contains a particular JSON-LD object

from seotools import page_contains_jsonld
import requests

content = requests.get("https://jamesg.blog")

print(page_contains_jsonld(content, "FAQPage"))
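The idea behind this check can be sketched without the library: parse any `application/ld+json` script tags on the page and look for the requested `@type`. This is an illustration of the technique, not SEOtools' actual implementation:

```python
import json

from bs4 import BeautifulSoup


def contains_jsonld_type(html, type_name):
    """Return True if any JSON-LD block on the page declares the given @type."""
    soup = BeautifulSoup(html, "html.parser")
    for script in soup.find_all("script", type="application/ld+json"):
        try:
            data = json.loads(script.string or "")
        except json.JSONDecodeError:
            continue
        # a JSON-LD script may hold one object or a list of objects
        items = data if isinstance(data, list) else [data]
        if any(item.get("@type") == type_name for item in items if isinstance(item, dict)):
            return True
    return False


html = '<script type="application/ld+json">{"@type": "FAQPage"}</script>'
print(contains_jsonld_type(html, "FAQPage"))  # True
```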

Get subfolders in a sitemap

analyzer.get_subpaths()

Get distance of URL from home page

analyzer.get_distance_from_home_page("https://jamesg.blog/2023/01/01/")
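Click distance of this kind is naturally a breadth-first search over the internal link graph. A self-contained sketch of the idea, over a made-up graph (not SEOtools' internal implementation):

```python
from collections import deque

# a made-up internal link graph: page -> pages it links to
links = {
    "home": ["about", "posts"],
    "posts": ["post-1"],
    "about": [],
    "post-1": [],
}


def distance_from_home(graph, start, target):
    """Return the minimum number of clicks from start to target, or -1."""
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        page, depth = queue.popleft()
        if page == target:
            return depth
        for linked in graph.get(page, []):
            if linked not in seen:
                seen.add(linked)
                queue.append((linked, depth + 1))
    return -1


print(distance_from_home(links, "home", "post-1"))  # 2: home -> posts -> post-1
```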

Retrieve keywords that appear more than N times on a web page

from seotools import get_keywords
import requests
from bs4 import BeautifulSoup

article = requests.get("https://jamesg.blog/...").text
parsed_article = BeautifulSoup(article, "html.parser").get_text()

# get keywords that appear more than 10 times
keywords = get_keywords(parsed_article, 10)
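A rough equivalent of this check, assuming `get_keywords` counts word frequency in the page text, can be sketched with `collections.Counter` (the function name and threshold below are illustrative):

```python
from collections import Counter


def keywords_above(text, n):
    """Return words that appear more than n times in text (case-insensitive)."""
    counts = Counter(text.lower().split())
    return {word: count for word, count in counts.items() if count > n}


sample = "seo tools help seo work because seo tools automate seo tasks"
print(keywords_above(sample, 3))  # only "seo" appears more than 3 times
```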


License 📝

This project is licensed under the MIT License.