# SEOtools 🛠️

A set of utilities that help SEOs and web developers complete common tasks.

With SEOtools, you can:
- Programmatically add links to related posts in content.
- Calculate PageRank on internal links from your sitemap.
- Identify broken links on a web page.
- Recommend a post to use as canonical for a given keyword.
- Find the distance of pages from your home page.
And more!
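Broken-link detection, for example, boils down to collecting a page's anchor targets and checking each one's HTTP status. Here is a minimal standard-library sketch of the idea — not the seotools API; the `find_broken_links` function and its `status_of` callback are illustrative:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class _LinkCollector(HTMLParser):
    """Collects href values from <a> tags in an HTML document."""

    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

def find_broken_links(html, base_url, status_of):
    """Return absolute link targets whose HTTP status indicates an error.

    status_of is any callable mapping a URL to a status code, e.g.
    lambda url: requests.head(url, allow_redirects=True).status_code
    """
    collector = _LinkCollector()
    collector.feed(html)
    targets = [urljoin(base_url, href) for href in collector.hrefs]
    return [url for url in targets if status_of(url) >= 400]
```

Injecting `status_of` keeps the link extraction separate from the network call, which also makes the function easy to test without fetching anything.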
## Installation 💻

You can install SEOtools using pip:

```bash
pip install seotools
```
## Quickstart 🚀

### Create a link graph

```python
from seotools.app import Analyzer

# load_from_disk=True reuses a link graph previously saved to disk;
# otherwise, a new link graph is built from the sitemap
analyzer = Analyzer("https://jamesg.blog/sitemap.xml", load_from_disk=True)
```
### Get the PageRank of a URL

```python
print(analyzer.pagerank["https://jamesg.blog"])
```
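seotools computes PageRank for you from the sitemap's internal link graph. For intuition about what that score means, here is a minimal power-iteration sketch over a hand-built link graph — the `pagerank_scores` function and the example graph are illustrative, not part of the library:

```python
def pagerank_scores(graph, damping=0.85, iterations=50):
    """Approximate PageRank for graph: a dict mapping each URL
    to the list of URLs it links to."""
    n = len(graph)
    ranks = {url: 1.0 / n for url in graph}
    for _ in range(iterations):
        # every page keeps a (1 - damping) baseline share
        next_ranks = {url: (1 - damping) / n for url in graph}
        for url, outlinks in graph.items():
            if outlinks:
                # a page splits its rank evenly across its outlinks
                share = damping * ranks[url] / len(outlinks)
                for target in outlinks:
                    next_ranks[target] += share
            else:
                # a dangling page spreads its rank across all pages
                for target in next_ranks:
                    next_ranks[target] += damping * ranks[url] / n
        ranks = next_ranks
    return ranks
```

Pages that receive many internal links (such as a home page linked from every post) accumulate a larger share of the total rank, which always sums to 1.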
### Add relevant internal links to a web page

```python
import markdown
import requests
from bs4 import BeautifulSoup

MAX_KEYWORD_REPLACE = 3

# map each keyword to the URL it should link to (example values)
keyword_map = {"seo": "https://jamesg.blog/category/seo/"}

article = requests.get("https://jamesg.blog/...").text
article = markdown.markdown(BeautifulSoup(article, "html.parser").get_text())

keyword_replace_count = 0

for keyword, url in keyword_map.items():
    if keyword_replace_count >= MAX_KEYWORD_REPLACE:
        break

    article = article.replace(keyword, f"<a href='{url}'>{keyword}</a>", 1)
    keyword_replace_count += 1

print(article)
```
### Recommend related content for a "See Also" section

```python
import markdown
import requests
from bs4 import BeautifulSoup

article = requests.get("https://jamesg.blog/...")
article = markdown.markdown(BeautifulSoup(article.text, "html.parser").get_text())

urls = analyzer.recommend_related_content(article)
```
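The recommendation logic itself is internal to seotools. As a rough illustration of how related content can be ranked, here is a cosine-similarity sketch over word counts — the `similarity` and `recommend_related` functions and their inputs are hypothetical, not the library's implementation:

```python
import math
import re
from collections import Counter

def similarity(a, b):
    """Cosine similarity between the word-count vectors of two texts."""
    va = Counter(re.findall(r"\w+", a.lower()))
    vb = Counter(re.findall(r"\w+", b.lower()))
    dot = sum(va[word] * vb[word] for word in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

def recommend_related(article_text, candidates, top_n=3):
    """Rank candidate pages ({url: text}) by similarity to article_text."""
    ranked = sorted(candidates,
                    key=lambda url: similarity(article_text, candidates[url]),
                    reverse=True)
    return ranked[:top_n]
```

Real systems typically weight terms (e.g. TF-IDF) or use embeddings, but the ranking step looks the same: score each candidate against the article, then take the top N.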
### Check if a page contains a particular JSON-LD object

```python
from seotools import page_contains_jsonld
import requests

content = requests.get("https://jamesg.blog")

print(page_contains_jsonld(content, "FAQPage"))
```
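If you are curious what such a check involves, here is a standard-library sketch that scans `<script type="application/ld+json">` blocks for a given `@type`. It mirrors the idea, not seotools' actual implementation, and the function names are illustrative:

```python
import json
from html.parser import HTMLParser

class _JSONLDExtractor(HTMLParser):
    """Collects the text content of JSON-LD <script> blocks."""

    def __init__(self):
        super().__init__()
        self._in_jsonld = False
        self._buffer = []
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self._in_jsonld = True
            self._buffer = []

    def handle_endtag(self, tag):
        if tag == "script" and self._in_jsonld:
            self._in_jsonld = False
            self.blocks.append("".join(self._buffer))

    def handle_data(self, data):
        if self._in_jsonld:
            self._buffer.append(data)

def contains_jsonld_type(html, type_name):
    """True if any JSON-LD block in html declares @type == type_name."""
    extractor = _JSONLDExtractor()
    extractor.feed(html)
    for block in extractor.blocks:
        try:
            data = json.loads(block)
        except json.JSONDecodeError:
            continue
        items = data if isinstance(data, list) else [data]
        if any(isinstance(item, dict) and item.get("@type") == type_name
               for item in items):
            return True
    return False
```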
### Get subfolders in a sitemap

```python
analyzer.get_subpaths()
```
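Conceptually, this groups the sitemap's URLs by their top-level path segment. A sketch of the equivalent logic (not the library's code — `top_level_subpaths` is an illustrative name):

```python
from urllib.parse import urlparse

def top_level_subpaths(urls):
    """Collect the first-level path segments ('subfolders') in a URL list."""
    subpaths = set()
    for url in urls:
        segments = [part for part in urlparse(url).path.split("/") if part]
        if segments:
            subpaths.add("/" + segments[0] + "/")
    return sorted(subpaths)
```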
### Get distance of URL from home page

```python
analyzer.get_distance_from_home_page("https://jamesg.blog/2023/01/01/")
```
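Distance here means link hops. A minimal breadth-first search over a hypothetical link graph shows the idea (this is an illustration, not seotools' internals):

```python
from collections import deque

def distances_from_home(graph, home):
    """BFS over a dict {url: [linked urls]}; returns {url: hops from home}."""
    distances = {home: 0}
    queue = deque([home])
    while queue:
        url = queue.popleft()
        for target in graph.get(url, []):
            if target not in distances:
                distances[target] = distances[url] + 1
                queue.append(target)
    return distances
```

Pages more than three or four hops from the home page are often worth surfacing more prominently, since crawlers and readers alike reach them less easily.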
### Retrieve keywords that appear more than N times on a web page

```python
from seotools import get_keywords
import requests
from bs4 import BeautifulSoup

article = requests.get("https://jamesg.blog/...").text
parsed_article = BeautifulSoup(article, "html.parser").get_text()

# get keywords that appear more than 10 times
keywords = get_keywords(parsed_article, 10)
```
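Counting frequent words is also a few lines of standard library. The sketch below is a simplified stand-in for `get_keywords` — the real helper may tokenize and filter differently:

```python
import re
from collections import Counter

def frequent_keywords(text, min_count):
    """Return words appearing more than min_count times, with their counts."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {word: count
            for word, count in Counter(words).items()
            if count > min_count}
```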
## See Also 📚
- getsitemap: Retrieve URLs in a sitemap. (Web interface)
## License 📝
This project is licensed under an MIT license.