Compare commits


10 commits

Author SHA1 Message Date
Patrick Mulligan
5e95b309fe make format
2025-11-17 00:01:21 +01:00
Patrick Mulligan
8547864254 docs: Update README with complete NIP-09 deletion support
Updated NIP-09 section to document full implementation including:
  - 'e' tags for deleting regular events by event ID
  - 'a' tags for deleting addressable events by address format (kind:pubkey:d-identifier)

  This reflects the implementation added in commits 3ba3318 and 538fe42 which brought the relay into full NIP-09 compliance.
2025-11-17 00:01:21 +01:00
Patrick Mulligan
dcc3204735 Fix NIP-09 deletion for parameterized replaceable events (NIP-33)
Fixed bug where deleting a parameterized replaceable event (e.g., kind 31922)
using an 'a' tag would incorrectly delete ALL events of that kind instead of
just the specific event with the matching d-tag.

**Root Cause:**
NostrFilter's 'd' field uses a Pydantic Field alias "#d". When creating a filter
with `NostrFilter(d=[value])`, Pydantic ignores it because the parameter name
doesn't match the alias.

**Fix:**
Changed filter creation to use the alias:
```python
NostrFilter(authors=[...], kinds=[...], **{"#d": [d_tag]})
```

**Testing:**
- Created two tasks with different d-tags
- Deleted only one task
- Verified only the specified task was marked as deleted in the database
- Confirmed the other task remained unaffected

This ensures proper NIP-09 deletion behavior for NIP-33 parameterized
replaceable events using 'a' tag format (kind:pubkey:d-identifier).

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-17 00:01:21 +01:00
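The alias pitfall behind this fix can be reproduced with a minimal, self-contained Pydantic model (`DemoFilter` is a hypothetical stand-in for `NostrFilter`; the field names mirror the description above):

```python
from pydantic import BaseModel, Field

class DemoFilter(BaseModel):
    # 'd' is only populated via its alias "#d", like the NostrFilter field
    d: list[str] = Field(default=[], alias="#d")
    kinds: list[int] = []

# Passing d=... is silently ignored: the field accepts only its alias,
# and unknown keyword arguments are dropped by default.
f1 = DemoFilter(d=["my-task"], kinds=[31922])
print(f1.d)  # [] -> the filter would match EVERY event of that kind

# Unpacking the alias key populates the field as intended
f2 = DemoFilter(**{"#d": ["my-task"]}, kinds=[31922])
print(f2.d)  # ['my-task']
```

This is why the fix builds the filter with `**{"#d": [d_tag]}` rather than `d=[d_tag]`.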
Patrick Mulligan
8bfd792548 Add NIP-09 support for parameterized replaceable events (NIP-33)
Extended NIP-09 deletion event handling to support both regular events
and parameterized replaceable events (NIP-33).

**Previous behavior:**
- Only handled 'e' tags (regular event IDs)
- Did not support 'a' tags for addressable/replaceable events

**New behavior:**
- Handles both 'e' tags (event IDs) and 'a' tags (event addresses)
- Parses 'a' tag format: kind:pubkey:d-identifier
- Validates deletion author matches event address pubkey (NIP-09 requirement)
- Creates appropriate filters for each deletion type

**Implementation:**
- Added parsing for 'a' tag event addresses
- Extract kind, pubkey, and d-tag from address format
- Build NostrFilter with authors, kinds, and d-tag parameters
- Collect all event IDs to delete from both 'e' and 'a' tags
- Mark matching events as deleted in single operation

This enables proper deletion of parameterized replaceable events like
calendar events (kind 31922-31924), long-form content (kind 30023),
and other addressable event kinds.

Implements NIP-09: https://github.com/nostr-protocol/nips/blob/master/09.md
Supports NIP-33: https://github.com/nostr-protocol/nips/blob/master/33.md
2025-11-17 00:01:21 +01:00
Vlad Stan
c4efb87b70
chore: bump version (#41) 2025-11-13 12:44:22 +02:00
dni ⚡
a53d2d7767
refactor: get rid of secp lib (#40) 2025-11-04 10:30:43 +02:00
dni ⚡
35584a230f
chore: add uv, linting, fixes (#39)
* chore: add uv, linting, fixes
2025-10-30 10:43:27 +01:00
PatMulligan
15079c3e58
fix(nostrrelay): use schema-qualified table name in delete_events (#38) 2025-10-27 09:54:29 +02:00
PatMulligan
22df5868de
FEAT: Implement NIP-01 Addressable Events Support (#33)
* Implement NIP-16 parameterized replaceable events

Add support for parameterized replaceable events (kinds 30000-39999) to properly handle Nostr marketplace product and stall updates according to NIP-16 specification.

Changes:
- Add is_parameterized_replaceable_event property to NostrEvent
- Implement automatic deletion of previous versions when new parameterized replaceable event is received
- Add 'd' tag filtering support to NostrFilter for parameterized replacement logic
- Update SQL query generation to handle 'd' tag joins

Fixes issue where product updates would create duplicate entries instead of replacing previous versions, ensuring only the latest version remains visible.

* Refactor event handling for addressable events

Renamed the property is_parameterized_replaceable_event to is_addressable_event in NostrEvent to align with NIP-01 specifications (previously NIP-16). Updated the client_connection.py to utilize the new property for extracting 'd' tag values for addressable replacement, ensuring proper event handling in the relay system.

* Refactor tag filtering logic in NostrFilter

Updated the tag filtering mechanism to ensure that the filter only fails if the specified tags ('e' and 'p') are not found. This change improves clarity and maintains functionality by allowing for more precise control over event filtering.

* update readme

* Fix addressable event deletion and SQL schema issues

- Fix Pydantic field alias usage for d tag filtering (use #d instead of d)
- Remove nostrrelay schema prefixes from SQL table references
- Implement subquery approach for DELETE operations with JOINs
- Resolve SQLite DELETE syntax incompatibility with JOIN statements
- Ensure NIP-33 compliance: only delete events with matching d tag values
2025-09-10 16:40:40 +03:00
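The kind ranges this commit relies on (per NIP-01) can be expressed as simple predicates; the function names here are illustrative, mirroring the `is_*_event` properties on `NostrEvent`:

```python
# NIP-01 event-kind classification by numeric range (sketch).
def is_replaceable(kind: int) -> bool:
    # Range-based replaceable events (kinds 0 and 3 are also replaceable)
    return 10000 <= kind < 20000

def is_ephemeral(kind: int) -> bool:
    # Ephemeral events are relayed but never stored
    return 20000 <= kind < 30000

def is_addressable(kind: int) -> bool:
    # Addressable (parameterized replaceable) events, keyed by the 'd' tag
    return 30000 <= kind < 40000

print(is_addressable(31922))  # True — calendar events are addressable
```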
PatMulligan
687d7b89c1
Fix REQ message handling to support multiple filter subscriptions (#34)
This fix addresses an issue where REQ messages with multiple filters
were being rejected by the relay. Notably: The nostrmarket extension's
"Refresh from Nostr" functionality sends a single REQ message containing
4 different filter subscriptions:
- Direct Messages (kinds: [4])
- Stalls (kinds: [30017])
- Products (kinds: [30018])
- Profile (kinds: [0])

Changes:
- Changed validation from `len(data) != 3` to `len(data) < 3` to allow multiple filters
- Added loop to process all filters in a single REQ message (data[2:])
- Accumulate responses from all filters before returning

This ensures compatibility with clients that batch multiple subscription
filters in a single REQ message, which is a valid pattern according to
NIP-01.
2025-09-10 16:35:25 +03:00
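The validation change and filter loop described above can be sketched as follows; `handle_request` is a hypothetical stand-in for the relay's per-filter handler:

```python
def handle_request(subscription_id: str, filter_data: dict) -> list:
    # Stand-in for the relay's per-filter event lookup
    return [["EVENT", subscription_id, {"matched": filter_data}]]

def handle_req(data: list) -> list:
    # A REQ message is ["REQ", <subscription_id>, <filter1>, <filter2>, ...]
    if len(data) < 3:  # was `!= 3`, which rejected multi-filter REQs
        return []
    subscription_id = data[1]
    responses = []
    for filter_data in data[2:]:  # process every filter, not just data[2]
        responses.extend(handle_request(subscription_id, filter_data))
    return responses

# The nostrmarket "Refresh from Nostr" pattern: four filters in one REQ
msg = ["REQ", "sub1", {"kinds": [4]}, {"kinds": [30017]},
       {"kinds": [30018]}, {"kinds": [0]}]
print(len(handle_req(msg)))  # 4 — one response per filter
```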
19 changed files with 2539 additions and 2774 deletions

```diff
@@ -11,14 +11,9 @@ jobs:
   tests:
     runs-on: ubuntu-latest
     needs: [lint]
-    strategy:
-      matrix:
-        python-version: ['3.9', '3.10']
     steps:
       - uses: actions/checkout@v4
       - uses: lnbits/lnbits/.github/actions/prepare@dev
-        with:
-          python-version: ${{ matrix.python-version }}
       - name: Run pytest
         uses: pavelzw/pytest-action@v2
         env:
@@ -30,5 +25,5 @@ jobs:
           job-summary: true
           emoji: false
           click-to-expand: true
-          custom-pytest: poetry run pytest
-          report-title: 'test (${{ matrix.python-version }})'
+          custom-pytest: uv run pytest
+          report-title: 'test'
```

```diff
@@ -5,27 +5,27 @@ format: prettier black ruff
 check: mypy pyright checkblack checkruff checkprettier

 prettier:
-	poetry run ./node_modules/.bin/prettier --write .
+	uv run ./node_modules/.bin/prettier --write .

 pyright:
-	poetry run ./node_modules/.bin/pyright
+	uv run ./node_modules/.bin/pyright

 mypy:
-	poetry run mypy .
+	uv run mypy .

 black:
-	poetry run black .
+	uv run black .

 ruff:
-	poetry run ruff check . --fix
+	uv run ruff check . --fix

 checkruff:
-	poetry run ruff check .
+	uv run ruff check .

 checkprettier:
-	poetry run ./node_modules/.bin/prettier --check .
+	uv run ./node_modules/.bin/prettier --check .

 checkblack:
-	poetry run black --check .
+	uv run black --check .

 checkeditorconfig:
 	editorconfig-checker
@@ -34,15 +34,15 @@ test:
 	LNBITS_DATA_FOLDER="./tests/data" \
 	PYTHONUNBUFFERED=1 \
 	DEBUG=true \
-	poetry run pytest
+	uv run pytest

 install-pre-commit-hook:
 	@echo "Installing pre-commit hook to git"
-	@echo "Uninstall the hook with poetry run pre-commit uninstall"
-	poetry run pre-commit install
+	@echo "Uninstall the hook with uv run pre-commit uninstall"
+	uv run pre-commit install

 pre-commit:
-	poetry run pre-commit run --all-files
+	uv run pre-commit run --all-files

 checkbundle:
```

```diff
@@ -15,11 +15,17 @@
 ## Supported NIPs

 - [x] **NIP-01**: Basic protocol flow
+  - [x] Regular Events
+  - [x] Replaceable Events (kinds 10000-19999)
+  - [x] Ephemeral Events (kinds 20000-29999)
+  - [x] Addressable Events (kinds 30000-39999)
 - [x] **NIP-02**: Contact List and Petnames
   - `kind: 3`: delete past contact lists as soon as the relay receives a new one
 - [x] **NIP-04**: Encrypted Direct Message
   - if `AUTH` enabled: send only to the intended target
 - [x] **NIP-09**: Event Deletion
+  - [x] 'e' tags: Delete regular events by event ID
+  - [x] 'a' tags: Delete addressable events by address (kind:pubkey:d-identifier)
 - [x] **NIP-11**: Relay Information Document
   - > **Note**: the endpoint is NOT on the root level of the domain. It also includes a path (eg https://lnbits.link/nostrrelay/)
 - [ ] **NIP-12**: Generic Tag Queries
@@ -36,8 +42,8 @@
   - not planned
 - [x] **NIP-28** Public Chat
   - `kind: 41`: handled similar to `kind 0` metadata events
-- [ ] **NIP-33**: Parameterized Replaceable Events
-  - todo
+- [x] **NIP-33**: Addressable Events (moved to NIP-01)
+  - ✅ Implemented as part of NIP-01 addressable events
 - [ ] **NIP-40**: Expiration Timestamp
   - todo
 - [x] **NIP-42**: Authentication of clients to relays
```

```diff
@@ -1,8 +1,9 @@
 {
   "name": "Nostr Relay",
+  "version": "1.1.0",
   "short_description": "One click launch your own relay!",
   "tile": "/nostrrelay/static/image/nostrrelay.png",
-  "min_lnbits_version": "1.0.0",
+  "min_lnbits_version": "1.4.0",
   "contributors": [
     {
       "name": "motorina0",
```

crud.py (26 changes)

@ -1,5 +1,4 @@
import json import json
from typing import Optional
from lnbits.db import Database from lnbits.db import Database
@ -21,7 +20,7 @@ async def update_relay(relay: NostrRelay) -> NostrRelay:
return relay return relay
async def get_relay(user_id: str, relay_id: str) -> Optional[NostrRelay]: async def get_relay(user_id: str, relay_id: str) -> NostrRelay | None:
return await db.fetchone( return await db.fetchone(
"SELECT * FROM nostrrelay.relays WHERE user_id = :user_id AND id = :id", "SELECT * FROM nostrrelay.relays WHERE user_id = :user_id AND id = :id",
{"user_id": user_id, "id": relay_id}, {"user_id": user_id, "id": relay_id},
@ -29,7 +28,7 @@ async def get_relay(user_id: str, relay_id: str) -> Optional[NostrRelay]:
) )
async def get_relay_by_id(relay_id: str) -> Optional[NostrRelay]: async def get_relay_by_id(relay_id: str) -> NostrRelay | None:
"""Note: it does not require `user_id`. Can read any relay. Use it with care.""" """Note: it does not require `user_id`. Can read any relay. Use it with care."""
return await db.fetchone( return await db.fetchone(
"SELECT * FROM nostrrelay.relays WHERE id = :id", "SELECT * FROM nostrrelay.relays WHERE id = :id",
@ -58,7 +57,7 @@ async def get_config_for_all_active_relays() -> dict:
return active_relay_configs return active_relay_configs
async def get_public_relay(relay_id: str) -> Optional[dict]: async def get_public_relay(relay_id: str) -> dict | None:
relay = await db.fetchone( relay = await db.fetchone(
"SELECT * FROM nostrrelay.relays WHERE id = :id", "SELECT * FROM nostrrelay.relays WHERE id = :id",
{"id": relay_id}, {"id": relay_id},
@ -130,7 +129,7 @@ async def get_events(
return events return events
async def get_event(relay_id: str, event_id: str) -> Optional[NostrEvent]: async def get_event(relay_id: str, event_id: str) -> NostrEvent | None:
event = await db.fetchone( event = await db.fetchone(
"SELECT * FROM nostrrelay.events WHERE relay_id = :relay_id AND id = :id", "SELECT * FROM nostrrelay.events WHERE relay_id = :relay_id AND id = :id",
{"relay_id": relay_id, "id": event_id}, {"relay_id": relay_id, "id": event_id},
@ -193,9 +192,20 @@ async def mark_events_deleted(relay_id: str, nostr_filter: NostrFilter):
async def delete_events(relay_id: str, nostr_filter: NostrFilter): async def delete_events(relay_id: str, nostr_filter: NostrFilter):
if nostr_filter.is_empty(): if nostr_filter.is_empty():
return None return None
_, where, values = nostr_filter.to_sql_components(relay_id) inner_joins, where, values = nostr_filter.to_sql_components(relay_id)
if inner_joins:
# Use subquery for DELETE operations with JOINs
subquery = f"""
SELECT nostrrelay.events.id FROM nostrrelay.events
{" ".join(inner_joins)}
WHERE {" AND ".join(where)}
"""
query = f"DELETE FROM nostrrelay.events WHERE id IN ({subquery})"
else:
# Simple DELETE without JOINs
query = f"DELETE FROM nostrrelay.events WHERE {' AND '.join(where)}"
query = f"DELETE from nostrrelay.events WHERE {' AND '.join(where)}"
await db.execute(query, values) await db.execute(query, values)
# todo: delete tags # todo: delete tags
@ -275,7 +285,7 @@ async def delete_account(relay_id: str, pubkey: str):
async def get_account( async def get_account(
relay_id: str, relay_id: str,
pubkey: str, pubkey: str,
) -> Optional[NostrAccount]: ) -> NostrAccount | None:
return await db.fetchone( return await db.fetchone(
""" """
SELECT * FROM nostrrelay.accounts SELECT * FROM nostrrelay.accounts
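The subquery approach in `delete_events` works around SQLite's lack of `DELETE ... INNER JOIN` support. A minimal standalone demonstration with simplified table names (stand-ins for `nostrrelay.events` / `nostrrelay.event_tags`):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE events (id TEXT PRIMARY KEY, kind INTEGER);
    CREATE TABLE event_tags (event_id TEXT, name TEXT, value TEXT);
    INSERT INTO events VALUES ('e1', 31922), ('e2', 31922);
    INSERT INTO event_tags VALUES ('e1', 'd', 'task-a'), ('e2', 'd', 'task-b');
""")

# SQLite rejects DELETE with a JOIN clause, so select the matching ids
# in a subquery and delete by id instead.
con.execute("""
    DELETE FROM events WHERE id IN (
        SELECT events.id FROM events
        INNER JOIN event_tags ON events.id = event_tags.event_id
        WHERE event_tags.name = 'd' AND event_tags.value = 'task-a'
    )
""")
remaining = [r[0] for r in con.execute("SELECT id FROM events")]
print(remaining)  # ['e2'] — only the event with the matching d tag was deleted
```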

```diff
@@ -1,5 +1,3 @@
-from typing import Optional
-
 from pydantic import BaseModel
@@ -16,8 +14,8 @@ class BuyOrder(BaseModel):
 class NostrPartialAccount(BaseModel):
     relay_id: str
     pubkey: str
-    allowed: Optional[bool] = None
-    blocked: Optional[bool] = None
+    allowed: bool | None = None
+    blocked: bool | None = None

 class NostrAccount(BaseModel):
@@ -44,4 +42,4 @@ class NostrEventTags(BaseModel):
     event_id: str
     name: str
     value: str
-    extra: Optional[str] = None
+    extra: str | None = None
```

poetry.lock (generated, 2629 changes) — file diff suppressed because it is too large.

```diff
@@ -1,40 +1,34 @@
-[tool.poetry]
+[project]
 name = "nostrrelay"
-version = "0.0.0"
-description = "nostrrelay"
-authors = ["dni <dni@lnbits.com>"]
+version = "1.1.0"
+requires-python = ">=3.10,<3.13"
+description = "LNbits, free and open-source Lightning wallet and accounts system."
+authors = [{ name = "Alan Bits", email = "alan@lnbits.com" }]
+urls = { Homepage = "https://lnbits.com", Repository = "https://github.com/lnbits/nostrrelay" }
+dependencies = [ "lnbits>1" ]

-[tool.poetry.dependencies]
-python = "^3.10 | ^3.9"
-lnbits = {allow-prereleases = true, version = "*"}
+[tool.poetry]
+package-mode = false

-[tool.poetry.group.dev.dependencies]
-black = "^24.3.0"
-pytest-asyncio = "^0.21.0"
-pytest = "^7.3.2"
-mypy = "^1.5.1"
-pre-commit = "^3.2.2"
-ruff = "^0.3.2"
-pytest-md = "^0.2.0"
-
-[build-system]
-requires = ["poetry-core>=1.0.0"]
-build-backend = "poetry.core.masonry.api"
+[dependency-groups]
+dev = [
+    "black",
+    "pytest-asyncio",
+    "pytest",
+    "mypy==1.17.1",
+    "pre-commit",
+    "ruff",
+    "pytest-md",
+]

 [tool.mypy]
-exclude = [
-    "boltz_client"
-]
-
-[[tool.mypy.overrides]]
-module = [
-    "lnbits.*",
-    "loguru.*",
-    "fastapi.*",
-    "pydantic.*",
-    "embit.*",
-    "secp256k1.*",
-]
-ignore_missing_imports = "True"
+plugins = ["pydantic.mypy"]
+
+[tool.pydantic-mypy]
+init_forbid_extra = true
+init_typed = true
+warn_required_dynamic_aliases = true
+warn_untyped_fields = true

 [tool.pytest.ini_options]
 log_cli = false
@@ -86,8 +80,8 @@ classmethod-decorators = [
 # [tool.ruff.lint.extend-per-file-ignores]
 # "views_api.py" = ["F401"]

-# [tool.ruff.lint.mccabe]
-# max-complexity = 10
+[tool.ruff.lint.mccabe]
+max-complexity = 11

 [tool.ruff.lint.flake8-bugbear]
 # Allow default arguments like, e.g., `data: List[str] = fastapi.Query(None)`.
```

```diff
@@ -1,6 +1,7 @@
 import json
 import time
-from typing import Any, Awaitable, Callable, List, Optional
+from collections.abc import Awaitable, Callable
+from typing import Any

 from fastapi import WebSocket
 from lnbits.helpers import urlsafe_short_hash
@@ -25,17 +26,17 @@ class NostrClientConnection:
     def __init__(self, relay_id: str, websocket: WebSocket):
         self.websocket = websocket
         self.relay_id = relay_id
-        self.filters: List[NostrFilter] = []
-        self.auth_pubkey: Optional[str] = None  # set if authenticated
-        self._auth_challenge: Optional[str] = None
+        self.filters: list[NostrFilter] = []
+        self.auth_pubkey: str | None = None  # set if authenticated
+        self._auth_challenge: str | None = None
         self._auth_challenge_created_at = 0
         self.event_validator = EventValidator(self.relay_id)
-        self.broadcast_event: Optional[
-            Callable[[NostrClientConnection, NostrEvent], Awaitable[None]]
-        ] = None
-        self.get_client_config: Optional[Callable[[], RelaySpec]] = None
+        self.broadcast_event: (
+            Callable[[NostrClientConnection, NostrEvent], Awaitable[None]] | None
+        ) = None
+        self.get_client_config: Callable[[], RelaySpec] | None = None

     async def start(self):
         await self.websocket.accept()
@@ -50,7 +51,7 @@ class NostrClientConnection:
         except Exception as e:
             logger.warning(e)

-    async def stop(self, reason: Optional[str]):
+    async def stop(self, reason: str | None):
         message = reason if reason else "Server closed webocket"
         try:
             await self._send_msg(["NOTICE", message])
@@ -98,7 +99,7 @@ class NostrClientConnection:
         if self.broadcast_event:
             await self.broadcast_event(self, e)

-    async def _handle_message(self, data: List) -> List:
+    async def _handle_message(self, data: list) -> list:
         if len(data) < 2:
             return []
@@ -115,9 +116,17 @@ class NostrClientConnection:
             await self._handle_event(event)
             return []
         if message_type == NostrEventType.REQ:
-            if len(data) != 3:
+            if len(data) < 3:
                 return []
-            return await self._handle_request(data[1], NostrFilter.parse_obj(data[2]))
+            subscription_id = data[1]
+            # Handle multiple filters in REQ message
+            responses = []
+            for filter_data in data[2:]:
+                response = await self._handle_request(
+                    subscription_id, NostrFilter.parse_obj(filter_data)
+                )
+                responses.extend(response)
+            return responses
         if message_type == NostrEventType.CLOSE:
             self._handle_close(data[1])
         if message_type == NostrEventType.AUTH:
@@ -127,7 +136,7 @@ class NostrClientConnection:
     async def _handle_event(self, e: NostrEvent):
         logger.info(f"nostr event: [{e.kind}, {e.pubkey}, '{e.content}']")
-        resp_nip20: List[Any] = ["OK", e.id]
+        resp_nip20: list[Any] = ["OK", e.id]

         if e.is_auth_response_event:
             valid, message = self.event_validator.validate_auth_event(
@@ -160,6 +169,19 @@ class NostrClientConnection:
                 self.relay_id,
                 NostrFilter(kinds=[e.kind], authors=[e.pubkey], until=e.created_at),
             )
+        if e.is_addressable_event:
+            # Extract 'd' tag value for addressable replacement (NIP-01)
+            d_tag_value = next((t[1] for t in e.tags if t[0] == "d"), None)
+            if d_tag_value:
+                deletion_filter = NostrFilter(
+                    kinds=[e.kind],
+                    authors=[e.pubkey],
+                    **{"#d": [d_tag_value]},  # type: ignore
+                    until=e.created_at,
+                )
+                await delete_events(self.relay_id, deletion_filter)

         if not e.is_ephemeral_event:
             await create_event(e)
         await self._broadcast_event(e)
@@ -182,20 +204,73 @@ class NostrClientConnection:
             raise Exception("Client not ready!")
         return self.get_client_config()

-    async def _send_msg(self, data: List):
+    async def _send_msg(self, data: list):
         await self.websocket.send_text(json.dumps(data))

     async def _handle_delete_event(self, event: NostrEvent):
-        # NIP 09
-        nostr_filter = NostrFilter(authors=[event.pubkey])
-        nostr_filter.ids = [t[1] for t in event.tags if t[0] == "e"]
-        events_to_delete = await get_events(self.relay_id, nostr_filter, False)
-        ids = [e.id for e in events_to_delete if not e.is_delete_event]
-        await mark_events_deleted(self.relay_id, NostrFilter(ids=ids))
+        # NIP 09 - Handle both regular events (e tags) and parameterized replaceable events (a tags)
+        # Get event IDs from 'e' tags (for regular events)
+        event_ids = [t[1] for t in event.tags if t[0] == "e"]
+        # Get event addresses from 'a' tags (for parameterized replaceable events)
+        event_addresses = [t[1] for t in event.tags if t[0] == "a"]
+        ids_to_delete = []
+
+        # Handle regular event deletions (e tags)
+        if event_ids:
+            nostr_filter = NostrFilter(authors=[event.pubkey], ids=event_ids)
+            events_to_delete = await get_events(self.relay_id, nostr_filter, False)
+            ids_to_delete.extend(
+                [e.id for e in events_to_delete if not e.is_delete_event]
+            )
+
+        # Handle parameterized replaceable event deletions (a tags)
+        if event_addresses:
+            for addr in event_addresses:
+                # Parse address format: kind:pubkey:d-tag
+                parts = addr.split(":")
+                if len(parts) == 3:
+                    kind_str, addr_pubkey, d_tag = parts
+                    try:
+                        kind = int(kind_str)
+                        # Only delete if the address pubkey matches the deletion event author
+                        if addr_pubkey == event.pubkey:
+                            # NOTE: Use "#d" alias, not "d" directly (Pydantic Field alias)
+                            nostr_filter = NostrFilter(
+                                authors=[addr_pubkey],
+                                kinds=[kind],
+                                **{"#d": [d_tag]},  # Use alias to set d field
+                            )
+                            events_to_delete = await get_events(
+                                self.relay_id, nostr_filter, False
+                            )
+                            ids_to_delete.extend(
+                                [
+                                    e.id
+                                    for e in events_to_delete
+                                    if not e.is_delete_event
+                                ]
+                            )
+                        else:
+                            logger.warning(
+                                f"Deletion request pubkey mismatch: {addr_pubkey} != {event.pubkey}"
+                            )
+                    except ValueError:
+                        logger.warning(f"Invalid kind in address: {addr}")
+                else:
+                    logger.warning(
+                        f"Invalid address format (expected kind:pubkey:d-tag): {addr}"
+                    )
+
+        # Only mark events as deleted if we found specific IDs
+        if ids_to_delete:
+            await mark_events_deleted(self.relay_id, NostrFilter(ids=ids_to_delete))

     async def _handle_request(
         self, subscription_id: str, nostr_filter: NostrFilter
-    ) -> List:
+    ) -> list:
         if self.config.require_auth_filter:
             if not self.auth_pubkey:
                 return [["AUTH", self._current_auth_challenge()]]
```

```diff
@@ -1,5 +1,3 @@
-from typing import List
-
 from ..crud import get_config_for_all_active_relays
 from .client_connection import NostrClientConnection
 from .event import NostrEvent
@@ -47,7 +45,7 @@ class NostrClientManager:
     def get_relay_config(self, relay_id: str) -> RelaySpec:
         return self._active_relays[relay_id]

-    def clients(self, relay_id: str) -> List[NostrClientConnection]:
+    def clients(self, relay_id: str) -> list[NostrClientConnection]:
         if relay_id not in self._clients:
             self._clients[relay_id] = []
         return self._clients[relay_id]
```

```diff
@@ -2,8 +2,8 @@ import hashlib
 import json
 from enum import Enum

+from coincurve import PublicKeyXOnly
 from pydantic import BaseModel, Field
-from secp256k1 import PublicKey

 class NostrEventType(str, Enum):
@@ -71,6 +71,10 @@ class NostrEvent(BaseModel):
     def is_ephemeral_event(self) -> bool:
         return self.kind >= 20000 and self.kind < 30000

+    @property
+    def is_addressable_event(self) -> bool:
+        return self.kind >= 30000 and self.kind < 40000
+
     def check_signature(self):
         event_id = self.event_id
         if self.id != event_id:
@@ -78,14 +82,15 @@ class NostrEvent(BaseModel):
                 f"Invalid event id. Expected: '{event_id}' got '{self.id}'"
             )
         try:
-            pub_key = PublicKey(bytes.fromhex("02" + self.pubkey), True)
+            pub_key = PublicKeyXOnly(bytes.fromhex(self.pubkey))
         except Exception as exc:
             raise ValueError(
                 f"Invalid public key: '{self.pubkey}' for event '{self.id}'"
             ) from exc
-        valid_signature = pub_key.schnorr_verify(
-            bytes.fromhex(event_id), bytes.fromhex(self.sig), None, raw=True
+        valid_signature = pub_key.verify(
+            bytes.fromhex(self.sig),
+            bytes.fromhex(event_id),
         )
         if not valid_signature:
             raise ValueError(f"Invalid signature: '{self.sig}' for event '{self.id}'")
```

```diff
@@ -1,5 +1,5 @@
 import time
-from typing import Callable, Optional, Tuple
+from collections.abc import Callable

 from ..crud import get_account, get_storage_for_public_key, prune_old_events
 from ..helpers import extract_domain
@@ -15,11 +15,11 @@ class EventValidator:
         self._last_event_timestamp = 0  # in hours
         self._event_count_per_timestamp = 0

-        self.get_client_config: Optional[Callable[[], RelaySpec]] = None
+        self.get_client_config: Callable[[], RelaySpec] | None = None

     async def validate_write(
         self, e: NostrEvent, publisher_pubkey: str
-    ) -> Tuple[bool, str]:
+    ) -> tuple[bool, str]:
         valid, message = self._validate_event(e)
         if not valid:
             return (valid, message)
@@ -34,8 +34,8 @@ class EventValidator:
         return True, ""

     def validate_auth_event(
-        self, e: NostrEvent, auth_challenge: Optional[str]
-    ) -> Tuple[bool, str]:
+        self, e: NostrEvent, auth_challenge: str | None
+    ) -> tuple[bool, str]:
         valid, message = self._validate_event(e)
         if not valid:
             return (valid, message)
@@ -59,7 +59,7 @@ class EventValidator:
             raise Exception("EventValidator not ready!")
         return self.get_client_config()

-    def _validate_event(self, e: NostrEvent) -> Tuple[bool, str]:
+    def _validate_event(self, e: NostrEvent) -> tuple[bool, str]:
         if self._exceeded_max_events_per_hour():
             return False, "Exceeded max events per hour limit'!"
@@ -76,7 +76,7 @@ class EventValidator:
     async def _validate_storage(
         self, pubkey: str, event_size_bytes: int
-    ) -> Tuple[bool, str]:
+    ) -> tuple[bool, str]:
         if self.config.is_read_only_relay:
             return False, "Cannot write event, relay is read-only"
@@ -124,7 +124,7 @@ class EventValidator:
         return self._event_count_per_timestamp > self.config.max_events_per_hour

-    def _created_at_in_range(self, created_at: int) -> Tuple[bool, str]:
+    def _created_at_in_range(self, created_at: int) -> tuple[bool, str]:
         current_time = round(time.time())
         if self.config.created_at_in_past != 0:
             if created_at < (current_time - self.config.created_at_in_past):
```

View file

@ -1,5 +1,3 @@
from typing import Optional
 from pydantic import BaseModel, Field

 from .event import NostrEvent

@@ -8,13 +6,14 @@ from .event import NostrEvent
 class NostrFilter(BaseModel):
     e: list[str] = Field(default=[], alias="#e")
     p: list[str] = Field(default=[], alias="#p")
+    d: list[str] = Field(default=[], alias="#d")
     ids: list[str] = []
     authors: list[str] = []
     kinds: list[int] = []
-    subscription_id: Optional[str] = None
-    since: Optional[int] = None
-    until: Optional[int] = None
-    limit: Optional[int] = None
+    subscription_id: str | None = None
+    since: int | None = None
+    until: int | None = None
+    limit: int | None = None

     def matches(self, e: NostrEvent) -> bool:
         # todo: starts with
@@ -30,9 +29,12 @@ class NostrFilter(BaseModel):
         if self.until and self.until > 0 and e.created_at > self.until:
             return False

-        found_e_tag = self.tag_in_list(e.tags, "e")
-        found_p_tag = self.tag_in_list(e.tags, "p")
-        if not found_e_tag or not found_p_tag:
+        # Check tag filters - only fail if filter is specified and no match found
+        if not self.tag_in_list(e.tags, "e"):
+            return False
+        if not self.tag_in_list(e.tags, "p"):
+            return False
+        if not self.tag_in_list(e.tags, "d"):
             return False

         return True
@@ -87,6 +89,17 @@ class NostrFilter(BaseModel):
             )
             where.append(f" p_tags.value in ({p_s}) AND p_tags.name = 'p'")

+        if len(self.d):
+            d_s = ",".join([f"'{d}'" for d in self.d])
+            d_join = (
+                "INNER JOIN nostrrelay.event_tags d_tags "
+                "ON nostrrelay.events.id = d_tags.event_id"
+            )
+            d_where = f" d_tags.value in ({d_s}) AND d_tags.name = 'd'"
+            inner_joins.append(d_join)
+            where.append(d_where)
+
         if len(self.ids) != 0:
             ids = ",".join([f"'{_id}'" for _id in self.ids])
             where.append(f"id IN ({ids})")
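The commit message for the d-tag fix attributes the original bug to Pydantic's field alias: constructing the filter with `NostrFilter(d=[...])` is silently ignored because the field is declared with `alias="#d"`, so the filter stays empty and matches every event of that kind. A minimal self-contained sketch of that pitfall (the `calendar-1` value is made up for illustration):

```python
from pydantic import BaseModel, Field


# Minimal stand-in for NostrFilter's aliased tag field.
class NostrFilter(BaseModel):
    d: list[str] = Field(default=[], alias="#d")


# Passing the Python field name does not match the alias, so Pydantic
# treats it as an unknown input and the field keeps its empty default.
broken = NostrFilter(d=["calendar-1"])
print(broken.d)  # []

# Passing via the alias (as the fix does with **{"#d": [...]}) populates it.
fixed = NostrFilter(**{"#d": ["calendar-1"]})
print(fixed.d)  # ['calendar-1']
```

This is also why the tests below build filters as `NostrFilter()` followed by `nostr_filter.p.append(...)`: attribute assignment bypasses the alias-based constructor entirely.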


@@ -1,5 +1,3 @@
-from typing import Optional
-
 from pydantic import BaseModel, Field
@@ -100,11 +98,11 @@ class RelaySpec(RelayPublicSpec, WalletSpec, AuthSpec):
 class NostrRelay(BaseModel):
     id: str
-    user_id: Optional[str] = None
+    user_id: str | None = None
     name: str
-    description: Optional[str] = None
-    pubkey: Optional[str] = None
-    contact: Optional[str] = None
+    description: str | None = None
+    pubkey: str | None = None
+    contact: str | None = None
     active: bool = False
     meta: RelaySpec = RelaySpec()


@@ -1,7 +1,7 @@
 import asyncio
 import inspect
-from typing import List, Optional

+import pytest
 import pytest_asyncio
 from lnbits.db import Database
 from loguru import logger
@@ -14,11 +14,11 @@ from .helpers import get_fixtures
 class EventFixture(BaseModel):
     name: str
-    exception: Optional[str]
+    exception: str | None
     data: NostrEvent


-@pytest_asyncio.fixture(scope="session")
+@pytest.fixture(scope="session")
 def event_loop():
     loop = asyncio.get_event_loop()
     yield loop
@@ -27,22 +27,27 @@ def event_loop():
 @pytest_asyncio.fixture(scope="session", autouse=True)
 async def migrate_db():
-    print("#### 999")
     db = Database("ext_nostrrelay")
+    await db.execute("DROP TABLE IF EXISTS nostrrelay.events;")
+    await db.execute("DROP TABLE IF EXISTS nostrrelay.relays;")
+    await db.execute("DROP TABLE IF EXISTS nostrrelay.event_tags;")
+    await db.execute("DROP TABLE IF EXISTS nostrrelay.accounts;")
+
+    # check if exists else skip migrations
     for key, migrate in inspect.getmembers(migrations, inspect.isfunction):
-        print("### 1000")
         logger.info(f"Running migration '{key}'.")
         await migrate(db)
-    return migrations
+
+    yield db


-@pytest_asyncio.fixture(scope="session")
-def valid_events(migrate_db) -> List[EventFixture]:
+@pytest.fixture(scope="session")
+def valid_events(migrate_db) -> list[EventFixture]:
     data = get_fixtures("events")
     return [EventFixture.parse_obj(e) for e in data["valid"]]


-@pytest_asyncio.fixture(scope="session")
-def invalid_events(migrate_db) -> List[EventFixture]:
+@pytest.fixture(scope="session")
+def invalid_events(migrate_db) -> list[EventFixture]:
     data = get_fixtures("events")
     return [EventFixture.parse_obj(e) for e in data["invalid"]]
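The change from `return migrations` to `yield db` in `migrate_db` follows pytest's yield-fixture convention: everything before the `yield` is setup, the yielded object is what dependent fixtures and tests receive, and any code after the `yield` runs at teardown. A plain-generator sketch of those mechanics (no pytest; names and the dict stand-in for the database handle are illustrative):

```python
def migrate_db():
    # Setup phase: stands in for Database("ext_nostrrelay") plus migrations.
    db = {"migrated": True}
    yield db        # dependent fixtures/tests receive this handle
    db.clear()      # teardown: runs when the generator is resumed

gen = migrate_db()
handle = next(gen)         # setup runs, the db handle comes out
print(handle["migrated"])  # True
```

With a plain `return`, the function is not a generator: pytest would hand tests the returned object directly, with no hook for teardown code.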


@@ -1,6 +1,5 @@
 import asyncio
 from json import dumps, loads
-from typing import Optional

 import pytest
 from fastapi import WebSocket
@@ -41,7 +40,7 @@ class MockWebSocket(WebSocket):
     async def wire_mock_data(self, data: dict):
         await self.fake_wire.put(dumps(data))

-    async def close(self, code: int = 1000, reason: Optional[str] = None) -> None:
+    async def close(self, code: int = 1000, reason: str | None = None) -> None:
         logger.info(f"{code}: {reason}")


@@ -1,5 +1,4 @@
 import json
-from typing import List

 import pytest
 from loguru import logger
@@ -16,23 +15,23 @@ from .conftest import EventFixture
 RELAY_ID = "r1"


-def test_valid_event_id_and_signature(valid_events: List[EventFixture]):
+def test_valid_event_id_and_signature(valid_events: list[EventFixture]):
     for f in valid_events:
         try:
             f.data.check_signature()
         except Exception as e:
-            logger.error(f"Invalid 'id' ot 'signature' for fixture: '{f.name}'")
+            logger.error(f"Invalid 'id' of 'signature' for fixture: '{f.name}'")
             raise e


-def test_invalid_event_id_and_signature(invalid_events: List[EventFixture]):
+def test_invalid_event_id_and_signature(invalid_events: list[EventFixture]):
     for f in invalid_events:
         with pytest.raises(ValueError, match=f.exception):
             f.data.check_signature()


 @pytest.mark.asyncio
-async def test_valid_event_crud(valid_events: List[EventFixture]):
+async def test_valid_event_crud(valid_events: list[EventFixture]):
     author = "a24496bca5dd73300f4e5d5d346c73132b7354c597fcbb6509891747b4689211"
     event_id = "3219eec7427e365585d5adf26f5d2dd2709d3f0f2c0e1f79dc9021e951c67d96"
     reply_event_id = "6b2b6cb9c72caaf3dfbc5baa5e68d75ac62f38ec011b36cc83832218c36e4894"
@@ -65,7 +64,7 @@ async def get_by_id(data: NostrEvent, test_name: str):
     ), f"Restored event is different for fixture '{test_name}'"


-async def filter_by_id(all_events: List[NostrEvent], data: NostrEvent, test_name: str):
+async def filter_by_id(all_events: list[NostrEvent], data: NostrEvent, test_name: str):
     nostr_filter = NostrFilter(ids=[data.id])
     events = await get_events(RELAY_ID, nostr_filter)
@@ -81,7 +80,7 @@ async def filter_by_id(all_events: List[NostrEvent], data: NostrEvent, test_name
     ), f"Filtered event is different for fixture '{test_name}'"


-async def filter_by_author(all_events: List[NostrEvent], author):
+async def filter_by_author(all_events: list[NostrEvent], author):
     nostr_filter = NostrFilter(authors=[author])
     events_by_author = await get_events(RELAY_ID, nostr_filter)
     assert len(events_by_author) == 5, "Failed to query by authors"
@@ -90,7 +89,7 @@ async def filter_by_author(all_events: List[NostrEvent], author):
     assert len(filtered_events) == 5, "Failed to filter by authors"


-async def filter_by_tag_p(all_events: List[NostrEvent], author):
+async def filter_by_tag_p(all_events: list[NostrEvent], author):
     # todo: check why constructor does not work for fields with aliases (#e, #p)
     nostr_filter = NostrFilter()
     nostr_filter.p.append(author)
@@ -102,7 +101,7 @@ async def filter_by_tag_p(all_events: List[NostrEvent], author):
     assert len(filtered_events) == 5, "Failed to filter by tag 'p'"


-async def filter_by_tag_e(all_events: List[NostrEvent], event_id):
+async def filter_by_tag_e(all_events: list[NostrEvent], event_id):
     nostr_filter = NostrFilter()
     nostr_filter.e.append(event_id)
@@ -114,7 +113,7 @@ async def filter_by_tag_e(all_events: List[NostrEvent], event_id):
 async def filter_by_tag_e_and_p(
-    all_events: List[NostrEvent], author, event_id, reply_event_id
+    all_events: list[NostrEvent], author, event_id, reply_event_id
 ):
     nostr_filter = NostrFilter()
     nostr_filter.p.append(author)
@@ -134,7 +133,7 @@ async def filter_by_tag_e_and_p(
 async def filter_by_tag_e_p_and_author(
-    all_events: List[NostrEvent], author, event_id, reply_event_id
+    all_events: list[NostrEvent], author, event_id, reply_event_id
 ):
     nostr_filter = NostrFilter(authors=[author])
     nostr_filter.p.append(author)

uv.lock (generated, new file, 2299 lines): file diff suppressed because it is too large.


@@ -1,5 +1,4 @@
 from http import HTTPStatus
-from typing import Optional

 from fastapi import APIRouter, Depends, HTTPException, Request, WebSocket
 from lnbits.core.crud import get_user
@@ -138,7 +137,7 @@ async def api_get_relay_info() -> JSONResponse:
 @nostrrelay_api_router.get("/api/v1/relay/{relay_id}")
 async def api_get_relay(
     relay_id: str, wallet: WalletTypeInfo = Depends(require_invoice_key)
-) -> Optional[NostrRelay]:
+) -> NostrRelay | None:
     relay = await get_relay(wallet.wallet.user, relay_id)
     if not relay:
         raise HTTPException(