SCHEDULED JOBS & PIPELINES

Cron Memories

Store state between cron runs. Know what happened last time without spinning up Postgres.

The Problem

You have a cron job that runs every hour to check for new items in an API. You need to remember the last timestamp you checked so you don't process duplicates.

Setting up a database for this one piece of state feels like overkill. You just need a place to store "last checked at X".

The WrenDB Solution

Create a stash once, then read/update it in your cron script. No database, no state file on disk, just HTTP.

Step-by-Step Guide

1. Set up your stash (one-time)

First, create a stash to hold your cron state:

Terminal
# Create your stash
curl -X POST https://wrendb.com/api/stash

{
  "stash_id": "stash-abc123",
  "master_token": "token-xyz789",
  "message": "Save this token securely..."
}

# Store these in your environment
export WREN_STASH_ID="stash-abc123"
export WREN_TOKEN="token-xyz789"
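
Since the whole setup rides on these two variables, it helps to fail fast when one is missing instead of making unauthenticated calls. A minimal sketch (the `require_env` helper is hypothetical, not part of WrenDB):

```python
import os
import sys

def require_env(name: str) -> str:
    """Return the value of an environment variable, or exit with a
    clear message if it is missing or empty."""
    value = os.environ.get(name)
    if not value:
        sys.exit(f"Missing required environment variable: {name}")
    return value

# In your cron script:
# STASH_ID = require_env("WREN_STASH_ID")
# TOKEN = require_env("WREN_TOKEN")
```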
2. Create your cron script

Here's a complete bash script that remembers the last run:

check_api.sh
#!/bin/bash

STASH_ID="$WREN_STASH_ID"
TOKEN="$WREN_TOKEN"

# Try to get the last checked timestamp
LAST_CHECKED=$(curl -s "https://wrendb.com/api/item/$STASH_ID/last-check")

# If empty, this is the first run - use a default
if [ -z "$LAST_CHECKED" ]; then
  LAST_CHECKED="2024-01-01T00:00:00Z"
  echo "First run! Using default timestamp: $LAST_CHECKED"
fi

echo "Last checked: $LAST_CHECKED"

# Fetch new items from your API since the last check
NEW_ITEMS=$(curl -s "https://your-api.com/items?since=$LAST_CHECKED")

# Process the items...
echo "Processing new items..."
echo "$NEW_ITEMS" | jq '.items[] | .id'

# Update the timestamp for next time
CURRENT_TIME=$(date -u +"%Y-%m-%dT%H:%M:%SZ")
curl -X PUT "https://wrendb.com/api/item/$STASH_ID/last-check" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: text/plain" \
  -d "$CURRENT_TIME"

echo "Updated last-check to: $CURRENT_TIME"
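
The script delegates filtering to your API's `since` parameter. If your API can return items at or before the boundary, a client-side filter is a cheap safety net: the fixed-width UTC format `YYYY-MM-DDTHH:MM:SSZ` sorts chronologically when compared as plain strings, so no date parsing is needed. A sketch (the `created_at` field is an assumed item shape, not part of WrenDB):

```python
def filter_new(items, last_checked):
    """Keep only items created strictly after `last_checked`.

    Fixed-width UTC timestamps (YYYY-MM-DDTHH:MM:SSZ) sort
    chronologically as strings, so plain comparison is safe.
    """
    return [item for item in items if item["created_at"] > last_checked]

items = [
    {"id": 1, "created_at": "2024-01-01T09:00:00Z"},
    {"id": 2, "created_at": "2024-01-01T11:30:00Z"},
]
print(filter_new(items, "2024-01-01T10:00:00Z"))  # keeps only id 2
```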
3. Schedule it

Add to your crontab to run every hour:

crontab
# Run every hour
0 * * * * /path/to/check_api.sh >> /var/log/cron.log 2>&1
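
Cron gives you no overlap protection: if a run ever takes longer than an hour, the next one fires anyway and both race on `last-check`. One common guard is an advisory file lock; a sketch (the lock path is an assumption, any writable path works):

```python
import fcntl
import sys

LOCK_PATH = "/tmp/check_api.lock"  # assumed path; pick one per job

def acquire_lock(path: str):
    """Try to take an exclusive, non-blocking lock on `path`.

    Returns the open file object on success (keep it open for the
    lifetime of the run) or None if another run holds the lock.
    """
    lock_file = open(path, "w")
    try:
        fcntl.flock(lock_file, fcntl.LOCK_EX | fcntl.LOCK_NB)
        return lock_file
    except OSError:
        lock_file.close()
        return None

# At the top of your cron script:
# lock = acquire_lock(LOCK_PATH)
# if lock is None:
#     sys.exit("Previous run still in progress; skipping.")
```

The lock is released automatically when the process exits, so a crashed run never wedges the job.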

Python Version

check_api.py
import os
import requests
from datetime import datetime, timezone

STASH_ID = os.environ["WREN_STASH_ID"]
TOKEN = os.environ["WREN_TOKEN"]
BASE_URL = f"https://wrendb.com/api/item/{STASH_ID}"

def get_last_check():
    """Get the last checked timestamp from WrenDB"""
    response = requests.get(f"{BASE_URL}/last-check", timeout=10)
    if response.status_code == 200:
        return response.text
    return "2024-01-01T00:00:00Z"  # Default for first run

def update_last_check(timestamp):
    """Save the new timestamp to WrenDB"""
    requests.put(
        f"{BASE_URL}/last-check",
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "text/plain",
        },
        data=timestamp,
        timeout=10,
    )

def main():
    # Get last check time
    last_checked = get_last_check()
    print(f"Last checked: {last_checked}")

    # Fetch new items from your API
    response = requests.get(
        f"https://your-api.com/items?since={last_checked}"
    )
    new_items = response.json()

    # Process items...
    print(f"Found {len(new_items['items'])} new items")
    for item in new_items['items']:
        print(f"Processing: {item['id']}")

    # Update timestamp
    # Match the Z-suffixed format the bash version writes
    # (isoformat() would emit a "+00:00" offset instead)
    current_time = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    update_last_check(current_time)
    print(f"Updated last-check to: {current_time}")

if __name__ == "__main__":
    main()
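
A transient network hiccup shouldn't crash the whole run. Wrapping the read in a try/except degrades to the first-run default, which means reprocessing a window of items instead of a failed job. A hedged variant of `get_last_check` (same endpoint as above):

```python
import requests

DEFAULT_START = "2024-01-01T00:00:00Z"

def get_last_check_safe(base_url: str, timeout: float = 10.0) -> str:
    """Fetch the saved timestamp, falling back to the first-run
    default on any network failure or non-200 response."""
    try:
        response = requests.get(f"{base_url}/last-check", timeout=timeout)
        if response.status_code == 200 and response.text:
            return response.text
    except requests.RequestException:
        pass
    return DEFAULT_START
```

This trade-off only works because processing an item twice is cheap here; if your downstream step is not idempotent, fail loudly instead of falling back.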

Why This Works

  • No database setup - Just environment variables and HTTP calls
  • Stateless cron - Your script can run anywhere without mounting volumes
  • Easy debugging - Just curl the endpoint to see the last state
  • Auto-cleanup - If the cron stops running, the data expires automatically

Related Use Cases

  • CI/CD State - Pass data between pipeline stages
  • Build Badge Data - Store pass/fail status for badges