I Built a Personal Job Hunting Assistant That Runs on My Raspberry Pi
The Idea
Job hunting as a fresh CS grad is a lot of work. Not the interview prep or the portfolio building; that part makes sense. The tedious part is the daily browsing: opening LinkedIn, searching the same keywords, switching to Indeed, searching again, and trying to mentally filter listings that are not really relevant to where you are in your career.
I wanted something that could handle that part for me. A personal assistant that runs quietly in the background, goes through the listings every morning, figures out which ones actually match my profile, and just tells me the ones worth looking at.
I had a Raspberry Pi sitting on my desk already running a local LLM. The pieces were kind of already there. So I put them together and built it.
What the Bot Actually Does
The system is pretty simple. It scrapes LinkedIn and Indeed on a schedule for roles matching my criteria (software engineer, Bangkok, entry-level or internship). Then it sends each listing to LLaMA via API to score how well it fits my profile. Finally it sends me a Telegram message with the top matches including the job title, company, a one-line summary, and a fit score.
No dashboard, no database admin panel. Just a clean Telegram notification with the jobs worth looking at, delivered automatically every morning.
The Stack
- Raspberry Pi 4 as the always-on host
- Python for scraping, orchestration, and Telegram integration
- LLaMA via API for job scoring and summarisation
- python-telegram-bot for message delivery
- cron to trigger everything on a schedule
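The schedule itself is a single crontab line on the Pi. The paths and script name here are just illustrative:

```shell
# crontab -e — run the pipeline every morning at 07:00, append output to a log
0 7 * * * /usr/bin/python3 /home/pi/jobbot/main.py >> /home/pi/jobbot/jobbot.log 2>&1
```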
Scraping LinkedIn and Indeed
Both platforms actively resist scraping so this part required some care. For LinkedIn I used requests with rotating user-agent headers and added deliberate delays between requests to avoid triggering rate limits. The job search URL accepts query parameters for keywords, location, and date posted so it was straightforward to construct a targeted URL and parse the HTML with BeautifulSoup.
```python
import requests
from bs4 import BeautifulSoup
import time, random

def scrape_linkedin(keyword, location):
    # f_TPR=r86400 limits results to jobs posted in the last 24 hours
    url = f"https://www.linkedin.com/jobs/search/?keywords={keyword}&location={location}&f_TPR=r86400"
    headers = {"User-Agent": "Mozilla/5.0 ..."}
    response = requests.get(url, headers=headers)
    soup = BeautifulSoup(response.text, "html.parser")

    jobs = []
    for card in soup.select(".job-search-card"):
        title = card.select_one(".base-search-card__title")
        company = card.select_one(".base-search-card__subtitle")
        link = card.select_one("a")
        if title and company and link:
            jobs.append({
                "title": title.text.strip(),
                "company": company.text.strip(),
                "url": link["href"],
            })

    # random pause so back-to-back scrapes don't look automated
    time.sleep(random.uniform(1.5, 3.0))
    return jobs
```
Indeed followed a similar pattern: parse the search results page and extract the title, company, and URL from each listing card.
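For the URL-building half of that, urllib.parse keeps the query encoding out of the way. A small sketch, under the assumption that Indeed's q/l/fromage parameters (keywords, location, posting age in days) still behave the way they did when I built this:

```python
from urllib.parse import urlencode

def build_indeed_url(keyword, location, days=1):
    # q = keywords, l = location, fromage = max posting age in days
    params = {"q": keyword, "l": location, "fromage": days}
    return "https://www.indeed.com/jobs?" + urlencode(params)
```

urlencode handles spaces and non-ASCII characters, so "software engineer" becomes software+engineer without any hand-rolled escaping.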
Scoring with LLaMA
Raw listings are noisy. A lot of "Software Engineer" roles are actually looking for five-plus years of experience or a stack I have never touched. I did not want those cluttering my Telegram feed, so every listing gets sent to LLaMA with a simple prompt.
```python
def score_job(job, my_profile):
    prompt = f"""
You are a career assistant. Given a job listing and a candidate profile, return a JSON object with:
- "score": integer from 1-10 (fit score)
- "summary": one sentence explaining why this role is or isn't a good match

Job Title: {job['title']}
Company: {job['company']}

Candidate Profile:
{my_profile}

Return only valid JSON. No extra text.
"""
    response = call_llama_api(prompt)
    return parse_json_response(response)
```
The my_profile variable is a short plaintext description of my stack, graduation status, and what I am looking for. LLaMA returns a score and a one-line reason. Anything below a 6 gets dropped before it even reaches Telegram.
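That "anything below a 6" filter only works when the response actually parses, and the model does not always obey the "return only valid JSON" instruction; sometimes the object arrives wrapped in extra prose. A minimal sketch of the parse_json_response helper (not the exact code) just pulls the first {...} block out of the reply:

```python
import json
import re

def parse_json_response(text):
    """Extract the first JSON object from an LLM reply, tolerating extra prose."""
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if not match:
        return None  # treat an unparseable reply as a failed scoring
    try:
        return json.loads(match.group(0))
    except json.JSONDecodeError:
        return None
```

Returning None instead of raising means one garbled reply skips a single listing rather than killing the whole morning run.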
The Telegram Notification
This is honestly the best part. After scoring, the top listings get formatted and sent as a Telegram message.
```python
from telegram import Bot

async def send_results(jobs):
    bot = Bot(token=TELEGRAM_BOT_TOKEN)
    message = "🤖 *Today's Job Matches*\n\n"
    for job in jobs:
        message += f"*{job['title']}* @ {job['company']}\n"
        message += f"Score: {job['score']}/10 — {job['summary']}\n"
        message += f"[View listing]({job['url']})\n\n"
    await bot.send_message(
        chat_id=CHAT_ID,
        text=message,
        parse_mode="Markdown"
    )
```
Every morning I wake up to a neat list of pre-filtered, LLM-scored job listings in my personal Telegram. No tab switching, no duplicates, just the ones worth my attention.
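For completeness, the glue that cron actually triggers is roughly this shape. select_top is my filtering step (drop anything under 6, best first); scrape_linkedin, score_job, and send_results are the functions from the snippets above, and scrape_indeed mirrors the LinkedIn scraper. Since python-telegram-bot's send_message is async, the run goes through asyncio.run:

```python
import asyncio

def select_top(scored_jobs, threshold=6, limit=5):
    # keep listings at or above the score threshold, highest score first
    keep = [j for j in scored_jobs if j.get("score", 0) >= threshold]
    return sorted(keep, key=lambda j: j["score"], reverse=True)[:limit]

async def main():
    jobs = (scrape_linkedin("software engineer", "Bangkok")
            + scrape_indeed("software engineer", "Bangkok"))
    scored = []
    for job in jobs:
        result = score_job(job, my_profile)
        if result:  # skip listings the JSON parser gave up on
            scored.append({**job, **result})
    await send_results(select_top(scored))

# cron invokes the script, which just runs:
#   asyncio.run(main())
```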

Lessons Learned
LLM scoring is surprisingly reliable for this use case. I expected it to be a fun gimmick, but the assessments were genuinely useful. It caught mismatches I would have caught myself, just faster, and before I had clicked through three pages of listings.
Scraping is a maintenance burden though. Both LinkedIn and Indeed change their HTML structure from time to time, and that breaks the selectors. I have had to update the scraper twice already. If I rebuilt this, I would look at an unofficial jobs API or a scraping service to absorb that overhead.
Smaller scope also shipped faster. My original idea had a full scoring rubric, cover letter generation, and auto-apply functionality. I cut all of it and shipped the minimal version first: scrape, score, notify. That version runs reliably. The fancier ideas are still sitting on a list somewhere.
Running it on the Pi was the right call too. It is always on, costs nothing extra to run, and it keeps the project grounded as a tool I actually use rather than a side project deployed to the cloud and forgotten about.
What I Would Do Differently
If I started over I would add a simple SQLite database to track which listings I have already seen so the bot only surfaces genuinely new results. Right now there is some overlap between runs. I would also add a quick feedback mechanism, a Telegram button to mark a listing as applied or not interested, so the scoring prompt can be refined over time based on my actual decisions.
Final Thought
I am still job hunting as I write this. But somewhere along the way, building the assistant became just as valuable as what it was finding for me. It reminded me that I actually enjoy solving problems, even when the problem is my own situation.
