v0.1.0-dev2: Full TPDB integration with auto-fetch and comprehensive docs
Major Features: - ✅ Complete TPDB scraper implementation with real API calls - ✅ Auto-fetch on cache miss: search commands now automatically import from TPDB when not found locally - ✅ Comprehensive documentation (5 markdown files for Bookstack) - ✅ Import commands for performers, studios, and scenes - ✅ Fixed JSON type mismatches (aliases array, studio numeric IDs) Changes: 1. TPDB Scraper (internal/scraper/tpdb/): - types.go: Full API response structures with correct types - PerformerResponse.Aliases: string → []string (TPDB returns array) - StudioResponse.ID: string → int (TPDB returns numeric IDs) - SiteInfo.ID: string → int (scenes reference studios by number) - mapper.go: Maps TPDB responses to internal models - Converts aliases array to comma-separated string - Converts numeric studio IDs to strings using strconv.Itoa() - scraper.go: Real HTTP client with Bearer token auth - SearchPerformers, SearchStudios, SearchScenes implemented - GetPerformerByID, GetStudioByID, GetSceneByID implemented 2. CLI Auto-Fetch (cmd/goondex/main.go): - performer-search: Auto-fetches from TPDB if local DB empty - studio-search: Auto-fetches from TPDB if local DB empty - scene-search: Auto-fetches basic metadata (no relationships) - Graceful handling of missing TPDB_API_KEY - Import → search again to get local IDs 3. Documentation (docs/): - INDEX.md: Documentation overview and navigation - ARCHITECTURE.md: System design, data flow, component diagrams - DATABASE_SCHEMA.md: Complete schema with relationships and indexes - CLI_REFERENCE.md: All commands with examples - TPDB_INTEGRATION.md: API guide, data mapping, best practices 4. Fixes: - .gitignore: Fixed pattern to allow cmd/goondex/* and cmd/goondexd/* - README: Updated to reflect TPDB integration and auto-fetch Testing: - ✅ performer-search "Riley Reid" - auto-fetched 2 performers, cached - ✅ studio-search "Brazzers" - auto-fetched 12 studios, cached - ✅ Aliases now display correctly as comma-separated list - ✅ Studio IDs properly converted from numeric to string API Integration: - Base URL: https://api.theporndb.net - Authentication: Bearer token via TPDB_API_KEY env var - Endpoints: /performers, /sites, /scenes - Rate limiting handled with warnings 🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude <noreply@anthropic.com>
This commit is contained in:
parent
2e747c6660
commit
f7d82cd207
4
.gitignore
vendored
4
.gitignore
vendored
|
|
@ -1,8 +1,8 @@
|
|||
# Binaries
|
||||
/bin/
|
||||
/dist/
|
||||
goondex
|
||||
goondexd
|
||||
/goondex
|
||||
/goondexd
|
||||
*.exe
|
||||
*.dll
|
||||
*.so
|
||||
|
|
|
|||
81
README.md
81
README.md
|
|
@ -2,19 +2,22 @@
|
|||
|
||||
**Fast, local-first media indexer for adult content.**
|
||||
|
||||
Goondex ingests metadata from external sources (TPDB, etc.), normalizes it, and stores it in a small SQLite database for quick search via CLI/TUI and background daemon tasks.
|
||||
Goondex ingests metadata from external sources (ThePornDB, etc.), normalizes it, and stores it in a small SQLite database for quick search via CLI/TUI and background daemon tasks.
|
||||
|
||||
## Version
|
||||
|
||||
**v0.1.0-dev1** - Initial development release
|
||||
**v0.1.0-dev2** - TPDB Integration Release
|
||||
|
||||
## Features (v0.1.0-dev1)
|
||||
## Features (v0.1.0-dev2)
|
||||
|
||||
- SQLite database with WAL mode for performers, studios, scenes, and tags
|
||||
- CLI search commands for local database queries
|
||||
- Pluggable scraper architecture (TPDB client stubbed)
|
||||
- Configuration via YAML files
|
||||
- Stash-inspired metadata resolution strategies (coming soon)
|
||||
- ✅ SQLite database with WAL mode for performers, studios, scenes, and tags
|
||||
- ✅ **Full TPDB scraper integration** with real API calls
|
||||
- ✅ **CLI import commands** - Fetch data directly from ThePornDB
|
||||
- ✅ CLI search commands for local database queries
|
||||
- ✅ Automatic relationship management (scenes ↔ performers, scenes ↔ tags)
|
||||
- ✅ Pluggable scraper architecture
|
||||
- ✅ Configuration via YAML files
|
||||
- ⏳ Stash-inspired metadata resolution strategies (coming in v0.2.x)
|
||||
|
||||
## Architecture
|
||||
|
||||
|
|
@ -57,22 +60,47 @@ Edit configuration files in `config/`:
|
|||
|
||||
## Usage
|
||||
|
||||
### CLI Commands (v0.1.0-dev1)
|
||||
### Quick Start
|
||||
|
||||
```bash
|
||||
# Search performers in local DB
|
||||
./bin/goondex performer-search "Riley Reid"
|
||||
# 1. Set your TPDB API key
|
||||
export TPDB_API_KEY="your-api-key-here"
|
||||
|
||||
# Search studios in local DB
|
||||
# 2. Import some data from ThePornDB
|
||||
./bin/goondex import performer "Riley Reid"
|
||||
./bin/goondex import studio "Brazzers"
|
||||
./bin/goondex import scene "Big Wet Butts"
|
||||
|
||||
# 3. Search your local database
|
||||
./bin/goondex performer-search "Riley"
|
||||
./bin/goondex studio-search "Brazzers"
|
||||
|
||||
# Search scenes in local DB
|
||||
./bin/goondex scene-search "Some Title"
|
||||
|
||||
# Show version
|
||||
./bin/goondex version
|
||||
./bin/goondex scene-search "Big Wet"
|
||||
```
|
||||
|
||||
### All Commands
|
||||
|
||||
**Import from ThePornDB** (requires `TPDB_API_KEY`):
|
||||
```bash
|
||||
./bin/goondex import performer [query] # Import performers
|
||||
./bin/goondex import studio [query] # Import studios
|
||||
./bin/goondex import scene [query] # Import scenes (+ performers, tags, studio)
|
||||
```
|
||||
|
||||
**Search Local Database**:
|
||||
```bash
|
||||
./bin/goondex performer-search [query] # Search performers
|
||||
./bin/goondex studio-search [query] # Search studios
|
||||
./bin/goondex scene-search [query] # Search scenes
|
||||
```
|
||||
|
||||
**Other**:
|
||||
```bash
|
||||
./bin/goondex version # Show version
|
||||
./bin/goondex --help # Show help
|
||||
```
|
||||
|
||||
See [CLI Reference](docs/CLI_REFERENCE.md) for complete documentation.
|
||||
|
||||
## Database Schema
|
||||
|
||||
- **performers**: id, name, aliases, nationality, country, gender, images, bio
|
||||
|
|
@ -82,13 +110,24 @@ Edit configuration files in `config/`:
|
|||
- **scene_performers**: junction table for scenes ↔ performers
|
||||
- **scene_tags**: junction table for scenes ↔ tags
|
||||
|
||||
## Documentation
|
||||
|
||||
Comprehensive documentation is available in the [`docs/`](docs/) directory:
|
||||
|
||||
- **[Index](docs/INDEX.md)** - Documentation overview
|
||||
- **[Architecture](docs/ARCHITECTURE.md)** - System design and components
|
||||
- **[Database Schema](docs/DATABASE_SCHEMA.md)** - Complete schema reference
|
||||
- **[CLI Reference](docs/CLI_REFERENCE.md)** - All commands and usage
|
||||
- **[TPDB Integration](docs/TPDB_INTEGRATION.md)** - ThePornDB API guide
|
||||
|
||||
## Roadmap
|
||||
|
||||
### v0.1.x
|
||||
### v0.1.x (Current)
|
||||
- [x] CLI search commands
|
||||
- [x] SQLite stores for all entities
|
||||
- [ ] TPDB scraper implementation
|
||||
- [ ] Basic metadata strategies
|
||||
- [x] **TPDB scraper implementation with real API integration**
|
||||
- [x] **Import commands (performer, studio, scene)**
|
||||
- [x] **Comprehensive documentation**
|
||||
- [ ] Image cache
|
||||
|
||||
### v0.2.x
|
||||
|
|
|
|||
541
cmd/goondex/main.go
Normal file
541
cmd/goondex/main.go
Normal file
|
|
@ -0,0 +1,541 @@
|
|||
package main
|
||||
|
||||
import (
|
||||
"context"
|
||||
"fmt"
|
||||
"os"
|
||||
|
||||
"github.com/spf13/cobra"
|
||||
"git.leaktechnologies.dev/stu/Goondex/internal/db"
|
||||
"git.leaktechnologies.dev/stu/Goondex/internal/scraper/tpdb"
|
||||
)
|
||||
|
||||
var (
|
||||
dbPath string
|
||||
rootCmd = &cobra.Command{
|
||||
Use: "goondex",
|
||||
Short: "Goondex - Fast, local-first media indexer",
|
||||
Long: `Goondex is a fast, local-first media indexer for adult content that ingests metadata from external sources.`,
|
||||
}
|
||||
)
|
||||
|
||||
func init() {
|
||||
rootCmd.PersistentFlags().StringVar(&dbPath, "db", "./goondex.db", "Path to SQLite database")
|
||||
|
||||
// Add subcommands
|
||||
rootCmd.AddCommand(performerSearchCmd)
|
||||
rootCmd.AddCommand(studioSearchCmd)
|
||||
rootCmd.AddCommand(sceneSearchCmd)
|
||||
rootCmd.AddCommand(importCmd)
|
||||
rootCmd.AddCommand(versionCmd)
|
||||
}
|
||||
|
||||
// Import command with subcommands
|
||||
var importCmd = &cobra.Command{
|
||||
Use: "import",
|
||||
Short: "Import data from external sources (TPDB)",
|
||||
Long: `Import performers, studios, and scenes from ThePornDB into your local database.`,
|
||||
}
|
||||
|
||||
func init() {
|
||||
importCmd.AddCommand(importPerformerCmd)
|
||||
importCmd.AddCommand(importStudioCmd)
|
||||
importCmd.AddCommand(importSceneCmd)
|
||||
}
|
||||
|
||||
func main() {
|
||||
if err := rootCmd.Execute(); err != nil {
|
||||
fmt.Fprintf(os.Stderr, "Error: %v\n", err)
|
||||
os.Exit(1)
|
||||
}
|
||||
}
|
||||
|
||||
func getDB() (*db.DB, error) {
|
||||
database, err := db.Open(dbPath)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to open database: %w", err)
|
||||
}
|
||||
return database, nil
|
||||
}
|
||||
|
||||
var versionCmd = &cobra.Command{
|
||||
Use: "version",
|
||||
Short: "Print version information",
|
||||
Run: func(cmd *cobra.Command, args []string) {
|
||||
fmt.Println("Goondex v0.1.0-dev2")
|
||||
},
|
||||
}
|
||||
|
||||
var performerSearchCmd = &cobra.Command{
|
||||
Use: "performer-search [query]",
|
||||
Short: "Search for performers (auto-fetches from TPDB if not in local database)",
|
||||
Args: cobra.MinimumNArgs(1),
|
||||
RunE: func(cmd *cobra.Command, args []string) error {
|
||||
query := args[0]
|
||||
|
||||
database, err := getDB()
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
defer database.Close()
|
||||
|
||||
store := db.NewPerformerStore(database)
|
||||
performers, err := store.Search(query)
|
||||
if err != nil {
|
||||
return fmt.Errorf("search failed: %w", err)
|
||||
}
|
||||
|
||||
// If no local results, try fetching from TPDB
|
||||
if len(performers) == 0 {
|
||||
fmt.Printf("No local results found. Searching TPDB for '%s'...\n", query)
|
||||
|
||||
apiKey := os.Getenv("TPDB_API_KEY")
|
||||
if apiKey == "" {
|
||||
fmt.Println("⚠ TPDB_API_KEY not set. Cannot fetch from TPDB.")
|
||||
fmt.Println("Set it with: export TPDB_API_KEY=\"your-key\"")
|
||||
return nil
|
||||
}
|
||||
|
||||
scraper := tpdb.NewScraper("https://api.theporndb.net", apiKey)
|
||||
tpdbPerformers, err := scraper.SearchPerformers(context.Background(), query)
|
||||
if err != nil {
|
||||
fmt.Printf("⚠ TPDB search failed: %v\n", err)
|
||||
return nil
|
||||
}
|
||||
|
||||
if len(tpdbPerformers) == 0 {
|
||||
fmt.Println("No performers found on TPDB either.")
|
||||
return nil
|
||||
}
|
||||
|
||||
fmt.Printf("Found %d performer(s) on TPDB. Importing...\n\n", len(tpdbPerformers))
|
||||
|
||||
// Import from TPDB
|
||||
imported := 0
|
||||
for _, p := range tpdbPerformers {
|
||||
if err := store.Create(&p); err != nil {
|
||||
fmt.Printf("⚠ Failed to import %s: %v\n", p.Name, err)
|
||||
continue
|
||||
}
|
||||
imported++
|
||||
}
|
||||
|
||||
// Search again to get the imported performers with their IDs
|
||||
performers, err = store.Search(query)
|
||||
if err != nil {
|
||||
return fmt.Errorf("search failed after import: %w", err)
|
||||
}
|
||||
|
||||
fmt.Printf("✓ Imported %d performer(s)\n\n", imported)
|
||||
}
|
||||
|
||||
fmt.Printf("Found %d performer(s):\n\n", len(performers))
|
||||
for _, p := range performers {
|
||||
fmt.Printf("ID: %d\n", p.ID)
|
||||
fmt.Printf("Name: %s\n", p.Name)
|
||||
if p.Aliases != "" {
|
||||
fmt.Printf("Aliases: %s\n", p.Aliases)
|
||||
}
|
||||
if p.Country != "" {
|
||||
fmt.Printf("Country: %s\n", p.Country)
|
||||
}
|
||||
if p.Gender != "" {
|
||||
fmt.Printf("Gender: %s\n", p.Gender)
|
||||
}
|
||||
if p.Source != "" {
|
||||
fmt.Printf("Source: %s (ID: %s)\n", p.Source, p.SourceID)
|
||||
}
|
||||
fmt.Println("---")
|
||||
}
|
||||
|
||||
return nil
|
||||
},
|
||||
}
|
||||
|
||||
var studioSearchCmd = &cobra.Command{
|
||||
Use: "studio-search [query]",
|
||||
Short: "Search for studios (auto-fetches from TPDB if not in local database)",
|
||||
Args: cobra.MinimumNArgs(1),
|
||||
RunE: func(cmd *cobra.Command, args []string) error {
|
||||
query := args[0]
|
||||
|
||||
database, err := getDB()
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
defer database.Close()
|
||||
|
||||
store := db.NewStudioStore(database)
|
||||
studios, err := store.Search(query)
|
||||
if err != nil {
|
||||
return fmt.Errorf("search failed: %w", err)
|
||||
}
|
||||
|
||||
// If no local results, try fetching from TPDB
|
||||
if len(studios) == 0 {
|
||||
fmt.Printf("No local results found. Searching TPDB for '%s'...\n", query)
|
||||
|
||||
apiKey := os.Getenv("TPDB_API_KEY")
|
||||
if apiKey == "" {
|
||||
fmt.Println("⚠ TPDB_API_KEY not set. Cannot fetch from TPDB.")
|
||||
fmt.Println("Set it with: export TPDB_API_KEY=\"your-key\"")
|
||||
return nil
|
||||
}
|
||||
|
||||
scraper := tpdb.NewScraper("https://api.theporndb.net", apiKey)
|
||||
tpdbStudios, err := scraper.SearchStudios(context.Background(), query)
|
||||
if err != nil {
|
||||
fmt.Printf("⚠ TPDB search failed: %v\n", err)
|
||||
return nil
|
||||
}
|
||||
|
||||
if len(tpdbStudios) == 0 {
|
||||
fmt.Println("No studios found on TPDB either.")
|
||||
return nil
|
||||
}
|
||||
|
||||
fmt.Printf("Found %d studio(s) on TPDB. Importing...\n\n", len(tpdbStudios))
|
||||
|
||||
// Import from TPDB
|
||||
imported := 0
|
||||
for _, s := range tpdbStudios {
|
||||
if err := store.Create(&s); err != nil {
|
||||
fmt.Printf("⚠ Failed to import %s: %v\n", s.Name, err)
|
||||
continue
|
||||
}
|
||||
imported++
|
||||
}
|
||||
|
||||
// Search again to get the imported studios with their IDs
|
||||
studios, err = store.Search(query)
|
||||
if err != nil {
|
||||
return fmt.Errorf("search failed after import: %w", err)
|
||||
}
|
||||
|
||||
fmt.Printf("✓ Imported %d studio(s)\n\n", imported)
|
||||
}
|
||||
|
||||
fmt.Printf("Found %d studio(s):\n\n", len(studios))
|
||||
for _, s := range studios {
|
||||
fmt.Printf("ID: %d\n", s.ID)
|
||||
fmt.Printf("Name: %s\n", s.Name)
|
||||
if s.Description != "" {
|
||||
fmt.Printf("Description: %s\n", s.Description)
|
||||
}
|
||||
if s.Source != "" {
|
||||
fmt.Printf("Source: %s (ID: %s)\n", s.Source, s.SourceID)
|
||||
}
|
||||
fmt.Println("---")
|
||||
}
|
||||
|
||||
return nil
|
||||
},
|
||||
}
|
||||
|
||||
var sceneSearchCmd = &cobra.Command{
|
||||
Use: "scene-search [query]",
|
||||
Short: "Search for scenes (auto-fetches from TPDB if not in local database)",
|
||||
Args: cobra.MinimumNArgs(1),
|
||||
RunE: func(cmd *cobra.Command, args []string) error {
|
||||
query := args[0]
|
||||
|
||||
database, err := getDB()
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
defer database.Close()
|
||||
|
||||
store := db.NewSceneStore(database)
|
||||
scenes, err := store.Search(query)
|
||||
if err != nil {
|
||||
return fmt.Errorf("search failed: %w", err)
|
||||
}
|
||||
|
||||
// If no local results, try fetching from TPDB
|
||||
if len(scenes) == 0 {
|
||||
fmt.Printf("No local results found. Searching TPDB for '%s'...\n", query)
|
||||
|
||||
apiKey := os.Getenv("TPDB_API_KEY")
|
||||
if apiKey == "" {
|
||||
fmt.Println("⚠ TPDB_API_KEY not set. Cannot fetch from TPDB.")
|
||||
fmt.Println("Set it with: export TPDB_API_KEY=\"your-key\"")
|
||||
return nil
|
||||
}
|
||||
|
||||
scraper := tpdb.NewScraper("https://api.theporndb.net", apiKey)
|
||||
tpdbScenes, err := scraper.SearchScenes(context.Background(), query)
|
||||
if err != nil {
|
||||
fmt.Printf("⚠ TPDB search failed: %v\n", err)
|
||||
return nil
|
||||
}
|
||||
|
||||
if len(tpdbScenes) == 0 {
|
||||
fmt.Println("No scenes found on TPDB either.")
|
||||
return nil
|
||||
}
|
||||
|
||||
fmt.Printf("Found %d scene(s) on TPDB. Importing (basic metadata only)...\n\n", len(tpdbScenes))
|
||||
|
||||
// Import scenes (simplified - just scene metadata, no relationships)
|
||||
imported := 0
|
||||
for _, sc := range tpdbScenes {
|
||||
// Clear relationships to avoid complexity in auto-import
|
||||
sc.Performers = nil
|
||||
sc.Tags = nil
|
||||
sc.Studio = nil
|
||||
sc.StudioID = nil
|
||||
|
||||
if err := store.Create(&sc); err != nil {
|
||||
fmt.Printf("⚠ Failed to import %s: %v\n", sc.Title, err)
|
||||
continue
|
||||
}
|
||||
imported++
|
||||
}
|
||||
|
||||
// Search again to get the imported scenes with their IDs
|
||||
scenes, err = store.Search(query)
|
||||
if err != nil {
|
||||
return fmt.Errorf("search failed after import: %w", err)
|
||||
}
|
||||
|
||||
fmt.Printf("✓ Imported %d scene(s) (use 'import scene' for full metadata with relationships)\n\n", imported)
|
||||
}
|
||||
|
||||
fmt.Printf("Found %d scene(s):\n\n", len(scenes))
|
||||
for _, sc := range scenes {
|
||||
fmt.Printf("ID: %d\n", sc.ID)
|
||||
fmt.Printf("Title: %s\n", sc.Title)
|
||||
if sc.Code != "" {
|
||||
fmt.Printf("Code: %s\n", sc.Code)
|
||||
}
|
||||
if sc.Date != "" {
|
||||
fmt.Printf("Date: %s\n", sc.Date)
|
||||
}
|
||||
if sc.Description != "" {
|
||||
fmt.Printf("Description: %s\n", sc.Description)
|
||||
}
|
||||
if sc.Source != "" {
|
||||
fmt.Printf("Source: %s (ID: %s)\n", sc.Source, sc.SourceID)
|
||||
}
|
||||
fmt.Println("---")
|
||||
}
|
||||
|
||||
return nil
|
||||
},
|
||||
}
|
||||
|
||||
var importPerformerCmd = &cobra.Command{
|
||||
Use: "performer [query]",
|
||||
Short: "Search TPDB for performers and import them to local database",
|
||||
Args: cobra.MinimumNArgs(1),
|
||||
RunE: func(cmd *cobra.Command, args []string) error {
|
||||
query := args[0]
|
||||
|
||||
// Get API key from environment
|
||||
apiKey := os.Getenv("TPDB_API_KEY")
|
||||
if apiKey == "" {
|
||||
return fmt.Errorf("TPDB_API_KEY environment variable is not set")
|
||||
}
|
||||
|
||||
// Create TPDB scraper
|
||||
scraper := tpdb.NewScraper("https://api.theporndb.net", apiKey)
|
||||
|
||||
// Search TPDB
|
||||
fmt.Printf("Searching TPDB for performers matching '%s'...\n", query)
|
||||
performers, err := scraper.SearchPerformers(context.Background(), query)
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed to search TPDB: %w", err)
|
||||
}
|
||||
|
||||
if len(performers) == 0 {
|
||||
fmt.Println("No performers found on TPDB")
|
||||
return nil
|
||||
}
|
||||
|
||||
fmt.Printf("Found %d performer(s) on TPDB\n\n", len(performers))
|
||||
|
||||
// Open database
|
||||
database, err := getDB()
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
defer database.Close()
|
||||
|
||||
store := db.NewPerformerStore(database)
|
||||
|
||||
// Import each performer
|
||||
imported := 0
|
||||
for _, p := range performers {
|
||||
fmt.Printf("Importing: %s (TPDB ID: %s)\n", p.Name, p.SourceID)
|
||||
if err := store.Create(&p); err != nil {
|
||||
fmt.Printf(" ⚠ Warning: Failed to import: %v\n", err)
|
||||
continue
|
||||
}
|
||||
fmt.Printf(" ✓ Imported with local ID: %d\n", p.ID)
|
||||
imported++
|
||||
}
|
||||
|
||||
fmt.Printf("\n✓ Successfully imported %d/%d performers\n", imported, len(performers))
|
||||
return nil
|
||||
},
|
||||
}
|
||||
|
||||
var importStudioCmd = &cobra.Command{
|
||||
Use: "studio [query]",
|
||||
Short: "Search TPDB for studios and import them to local database",
|
||||
Args: cobra.MinimumNArgs(1),
|
||||
RunE: func(cmd *cobra.Command, args []string) error {
|
||||
query := args[0]
|
||||
|
||||
// Get API key from environment
|
||||
apiKey := os.Getenv("TPDB_API_KEY")
|
||||
if apiKey == "" {
|
||||
return fmt.Errorf("TPDB_API_KEY environment variable is not set")
|
||||
}
|
||||
|
||||
// Create TPDB scraper
|
||||
scraper := tpdb.NewScraper("https://api.theporndb.net", apiKey)
|
||||
|
||||
// Search TPDB
|
||||
fmt.Printf("Searching TPDB for studios matching '%s'...\n", query)
|
||||
studios, err := scraper.SearchStudios(context.Background(), query)
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed to search TPDB: %w", err)
|
||||
}
|
||||
|
||||
if len(studios) == 0 {
|
||||
fmt.Println("No studios found on TPDB")
|
||||
return nil
|
||||
}
|
||||
|
||||
fmt.Printf("Found %d studio(s) on TPDB\n\n", len(studios))
|
||||
|
||||
// Open database
|
||||
database, err := getDB()
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
defer database.Close()
|
||||
|
||||
store := db.NewStudioStore(database)
|
||||
|
||||
// Import each studio
|
||||
imported := 0
|
||||
for _, s := range studios {
|
||||
fmt.Printf("Importing: %s (TPDB ID: %s)\n", s.Name, s.SourceID)
|
||||
if err := store.Create(&s); err != nil {
|
||||
fmt.Printf(" ⚠ Warning: Failed to import: %v\n", err)
|
||||
continue
|
||||
}
|
||||
fmt.Printf(" ✓ Imported with local ID: %d\n", s.ID)
|
||||
imported++
|
||||
}
|
||||
|
||||
fmt.Printf("\n✓ Successfully imported %d/%d studios\n", imported, len(studios))
|
||||
return nil
|
||||
},
|
||||
}
|
||||
|
||||
var importSceneCmd = &cobra.Command{
|
||||
Use: "scene [query]",
|
||||
Short: "Search TPDB for scenes and import them to local database",
|
||||
Args: cobra.MinimumNArgs(1),
|
||||
RunE: func(cmd *cobra.Command, args []string) error {
|
||||
query := args[0]
|
||||
|
||||
// Get API key from environment
|
||||
apiKey := os.Getenv("TPDB_API_KEY")
|
||||
if apiKey == "" {
|
||||
return fmt.Errorf("TPDB_API_KEY environment variable is not set")
|
||||
}
|
||||
|
||||
// Create TPDB scraper
|
||||
scraper := tpdb.NewScraper("https://api.theporndb.net", apiKey)
|
||||
|
||||
// Search TPDB
|
||||
fmt.Printf("Searching TPDB for scenes matching '%s'...\n", query)
|
||||
scenes, err := scraper.SearchScenes(context.Background(), query)
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed to search TPDB: %w", err)
|
||||
}
|
||||
|
||||
if len(scenes) == 0 {
|
||||
fmt.Println("No scenes found on TPDB")
|
||||
return nil
|
||||
}
|
||||
|
||||
fmt.Printf("Found %d scene(s) on TPDB\n\n", len(scenes))
|
||||
|
||||
// Open database
|
||||
database, err := getDB()
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
defer database.Close()
|
||||
|
||||
sceneStore := db.NewSceneStore(database)
|
||||
performerStore := db.NewPerformerStore(database)
|
||||
studioStore := db.NewStudioStore(database)
|
||||
tagStore := db.NewTagStore(database)
|
||||
|
||||
// Import each scene
|
||||
imported := 0
|
||||
for _, sc := range scenes {
|
||||
fmt.Printf("Importing: %s (TPDB ID: %s)\n", sc.Title, sc.SourceID)
|
||||
|
||||
// Import studio if present
|
||||
if sc.Studio != nil {
|
||||
if err := studioStore.Create(sc.Studio); err != nil {
|
||||
// Studio might already exist, try to fetch it
|
||||
studios, _ := studioStore.Search(sc.Studio.Name)
|
||||
if len(studios) > 0 {
|
||||
sc.StudioID = &studios[0].ID
|
||||
}
|
||||
} else {
|
||||
sc.StudioID = &sc.Studio.ID
|
||||
}
|
||||
}
|
||||
|
||||
// Create scene
|
||||
if err := sceneStore.Create(&sc); err != nil {
|
||||
fmt.Printf(" ⚠ Warning: Failed to import scene: %v\n", err)
|
||||
continue
|
||||
}
|
||||
|
||||
// Import and link performers
|
||||
for _, p := range sc.Performers {
|
||||
if err := performerStore.Create(&p); err != nil {
|
||||
// Performer might already exist
|
||||
performers, _ := performerStore.Search(p.Name)
|
||||
if len(performers) > 0 {
|
||||
p.ID = performers[0].ID
|
||||
}
|
||||
}
|
||||
if p.ID > 0 {
|
||||
sceneStore.AddPerformer(sc.ID, p.ID)
|
||||
}
|
||||
}
|
||||
|
||||
// Import and link tags
|
||||
for _, t := range sc.Tags {
|
||||
existing, _ := tagStore.GetByName(t.Name)
|
||||
if existing != nil {
|
||||
t.ID = existing.ID
|
||||
} else {
|
||||
if err := tagStore.Create(&t); err != nil {
|
||||
continue
|
||||
}
|
||||
}
|
||||
if t.ID > 0 {
|
||||
sceneStore.AddTag(sc.ID, t.ID)
|
||||
}
|
||||
}
|
||||
|
||||
fmt.Printf(" ✓ Imported with local ID: %d\n", sc.ID)
|
||||
imported++
|
||||
}
|
||||
|
||||
fmt.Printf("\n✓ Successfully imported %d/%d scenes\n", imported, len(scenes))
|
||||
return nil
|
||||
},
|
||||
}
|
||||
10
cmd/goondexd/main.go
Normal file
10
cmd/goondexd/main.go
Normal file
|
|
@ -0,0 +1,10 @@
|
|||
package main
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
)
|
||||
|
||||
func main() {
|
||||
fmt.Println("goondexd daemon - Coming in v0.2.0")
|
||||
fmt.Println("For now, use the 'goondex' CLI tool")
|
||||
}
|
||||
290
docs/ARCHITECTURE.md
Normal file
290
docs/ARCHITECTURE.md
Normal file
|
|
@ -0,0 +1,290 @@
|
|||
# Architecture Overview
|
||||
|
||||
## High-Level Architecture
|
||||
|
||||
```
|
||||
┌─────────────────────────────────────────────────────────────┐
|
||||
│ User Interface │
|
||||
│ (CLI / TUI / Web) │
|
||||
└─────────────────────────────────────────────────────────────┘
|
||||
│
|
||||
▼
|
||||
┌─────────────────────────────────────────────────────────────┐
|
||||
│ Application Layer │
|
||||
│ ┌──────────────┐ ┌──────────────┐ ┌──────────────┐ │
|
||||
│ │ Commands │ │ Metadata │ │ Images │ │
|
||||
│ │ (Import/ │ │ Resolver │ │ (Cache/ │ │
|
||||
│ │ Search) │ │ │ │ Fetch) │ │
|
||||
│ └──────────────┘ └──────────────┘ └──────────────┘ │
|
||||
└─────────────────────────────────────────────────────────────┘
|
||||
│
|
||||
▼
|
||||
┌─────────────────────────────────────────────────────────────┐
|
||||
│ Scraper Layer │
|
||||
│ ┌──────────────┐ ┌──────────────┐ ┌──────────────┐ │
|
||||
│ │ TPDB │ │ AE │ │ Future │ │
|
||||
│ │ Scraper │ │ Scraper │ │ Sources │ │
|
||||
│ └──────────────┘ └──────────────┘ └──────────────┘ │
|
||||
│ Scraper Registry │
|
||||
└─────────────────────────────────────────────────────────────┘
|
||||
│
|
||||
▼
|
||||
┌─────────────────────────────────────────────────────────────┐
|
||||
│ Data Layer (DB) │
|
||||
│ ┌──────────────┐ ┌──────────────┐ ┌──────────────┐ │
|
||||
│ │ Performers │ │ Studios │ │ Scenes │ │
|
||||
│ │ Store │ │ Store │ │ Store │ │
|
||||
│ └──────────────┘ └──────────────┘ └──────────────┘ │
|
||||
│ ┌──────────────┐ ┌──────────────────────────────────┐ │
|
||||
│ │ Tags │ │ Junction Tables │ │
|
||||
│ │ Store │ │ (scene_performers, scene_tags) │ │
|
||||
│ └──────────────┘ └──────────────────────────────────┘ │
|
||||
└─────────────────────────────────────────────────────────────┘
|
||||
│
|
||||
▼
|
||||
┌─────────────────────────────────────────────────────────────┐
|
||||
│ SQLite Database │
|
||||
│ (WAL mode, Foreign Keys enabled) │
|
||||
└─────────────────────────────────────────────────────────────┘
|
||||
```
|
||||
|
||||
## Component Breakdown
|
||||
|
||||
### 1. User Interface Layer
|
||||
|
||||
#### CLI (`cmd/goondex`)
|
||||
- Command-line interface built with Cobra
|
||||
- Commands:
|
||||
- `search` - Search local database
|
||||
- `import` - Fetch from external sources
|
||||
- `version` - Show version
|
||||
- Environment-based configuration (TPDB_API_KEY)
|
||||
|
||||
#### TUI (Future: `internal/ui/tui`)
|
||||
- Terminal UI built with Bubble Tea
|
||||
- Features:
|
||||
- Browse performers/scenes/studios
|
||||
- View images (kitty/sixel protocol)
|
||||
- Interactive search
|
||||
|
||||
#### Daemon (Future: `cmd/goondexd`)
|
||||
- Background service
|
||||
- Scheduled scraping
|
||||
- Cache warming
|
||||
- Image pre-fetching
|
||||
|
||||
### 2. Application Layer
|
||||
|
||||
#### Commands (`cmd/goondex/*_cmd.go`)
|
||||
- Implements business logic for CLI commands
|
||||
- Orchestrates scraper + database operations
|
||||
- Handles error reporting and user feedback
|
||||
|
||||
#### Metadata Resolver (Future: `internal/metadata`)
|
||||
- Stash-inspired field merge strategies
|
||||
- Conflict resolution (IGNORE/MERGE/OVERWRITE)
|
||||
- De-duplication logic
|
||||
- Create-missing entity handling
|
||||
|
||||
#### Image Management (Future: `internal/images`)
|
||||
- Fetch images from URLs
|
||||
- Cache locally
|
||||
- Generate thumbnails/sprites
|
||||
- Byte-compare to avoid re-downloads
|
||||
|
||||
### 3. Scraper Layer
|
||||
|
||||
#### Scraper Interface (`internal/scraper/interface.go`)
|
||||
```go
|
||||
type Scraper interface {
|
||||
Name() string
|
||||
SearchPerformers(ctx, query) ([]Performer, error)
|
||||
SearchStudios(ctx, query) ([]Studio, error)
|
||||
SearchScenes(ctx, query) ([]Scene, error)
|
||||
GetPerformerByID(ctx, id) (*Performer, error)
|
||||
GetStudioByID(ctx, id) (*Studio, error)
|
||||
GetSceneByID(ctx, id) (*Scene, error)
|
||||
}
|
||||
```
|
||||
|
||||
#### TPDB Scraper (`internal/scraper/tpdb`)
|
||||
- HTTP client with Bearer auth
|
||||
- JSON response parsing
|
||||
- Mapping TPDB structs → internal models
|
||||
- Rate limiting (future)
|
||||
|
||||
#### Scraper Registry (`internal/scraper/registry.go`)
|
||||
- Manages available scrapers
|
||||
- Priority ordering
|
||||
- Fallback logic
|
||||
|
||||
### 4. Data Layer
|
||||
|
||||
#### Stores (`internal/db/*_store.go`)
|
||||
- Thin abstraction over SQL
|
||||
- CRUD operations
|
||||
- Search/filter methods
|
||||
- Transaction support (future)
|
||||
|
||||
#### Models (`internal/model/*.go`)
|
||||
- Pure Go structs
|
||||
- JSON serialization
|
||||
- No database coupling
|
||||
- Relationship fields
|
||||
|
||||
### 5. Storage Layer
|
||||
|
||||
#### SQLite Database
|
||||
- **Mode**: WAL (Write-Ahead Logging)
|
||||
- **Foreign Keys**: Enabled
|
||||
- **Indexes**: On name/title/code fields
|
||||
- **Location**: Configurable (default: `./goondex.db`)
|
||||
|
||||
## Data Flow Examples
|
||||
|
||||
### Import Flow
|
||||
|
||||
```
|
||||
User runs: goondex import performer "Riley Reid"
|
||||
│
|
||||
▼
|
||||
┌─────────────────────────────────┐
|
||||
│ 1. Parse command arguments │
|
||||
└─────────────────────────────────┘
|
||||
│
|
||||
▼
|
||||
┌─────────────────────────────────┐
|
||||
│ 2. Load TPDB_API_KEY from env │
|
||||
└─────────────────────────────────┘
|
||||
│
|
||||
▼
|
||||
┌─────────────────────────────────┐
|
||||
│ 3. Create TPDB scraper │
|
||||
└─────────────────────────────────┘
|
||||
│
|
||||
▼
|
||||
┌─────────────────────────────────┐
|
||||
│ 4. Call SearchPerformers() │
|
||||
│ - Build HTTP request │
|
||||
│ - Add Bearer token │
|
||||
│ - Parse JSON response │
|
||||
│ - Map to model.Performer │
|
||||
└─────────────────────────────────┘
|
||||
│
|
||||
▼
|
||||
┌─────────────────────────────────┐
|
||||
│ 5. Open local database │
|
||||
└─────────────────────────────────┘
|
||||
│
|
||||
▼
|
||||
┌─────────────────────────────────┐
|
||||
│ 6. For each performer: │
|
||||
│ - PerformerStore.Create() │
|
||||
│ - Insert into DB │
|
||||
│ - Report success/failure │
|
||||
└─────────────────────────────────┘
|
||||
```
|
||||
|
||||
### Search Flow
|
||||
|
||||
```
|
||||
User runs: goondex performer-search "Riley"
|
||||
│
|
||||
▼
|
||||
┌─────────────────────────────────┐
|
||||
│ 1. Parse command arguments │
|
||||
└─────────────────────────────────┘
|
||||
│
|
||||
▼
|
||||
┌─────────────────────────────────┐
|
||||
│ 2. Open local database │
|
||||
└─────────────────────────────────┘
|
||||
│
|
||||
▼
|
||||
┌─────────────────────────────────┐
|
||||
│ 3. PerformerStore.Search() │
|
||||
│ - Execute SQL LIKE query │
|
||||
│ - Parse rows into structs │
|
||||
└─────────────────────────────────┘
|
||||
│
|
||||
▼
|
||||
┌─────────────────────────────────┐
|
||||
│ 4. Format and display results │
|
||||
└─────────────────────────────────┘
|
||||
```
|
||||
|
||||
## Design Principles
|
||||
|
||||
### 1. **Local-First**
|
||||
- All data stored locally in SQLite
|
||||
- No required internet connection for search
|
||||
- External APIs only for imports/updates
|
||||
|
||||
### 2. **Pluggable Scrapers**
|
||||
- Standard interface for all sources
|
||||
- Easy to add new scrapers
|
||||
- Registry-based discovery
|
||||
|
||||
### 3. **Thin Database Layer**
|
||||
- Stores are lightweight wrappers
|
||||
- SQL is visible and auditable
|
||||
- No heavy ORM
|
||||
|
||||
### 4. **Clean Models**
|
||||
- Domain models are pure Go structs
|
||||
- No tags/annotations beyond JSON
|
||||
- Clear separation from DB layer
|
||||
|
||||
### 5. **Idempotent Operations**
|
||||
- Re-running imports is safe
|
||||
- Merge strategies prevent data loss
|
||||
- Duplicate detection
|
||||
|
||||
## Technology Stack
|
||||
|
||||
- **Language**: Go 1.25+
|
||||
- **Database**: SQLite (modernc.org/sqlite)
|
||||
- **CLI Framework**: Cobra
|
||||
- **HTTP Client**: net/http (stdlib)
|
||||
- **JSON**: encoding/json (stdlib)
|
||||
|
||||
## Directory Structure
|
||||
|
||||
```
|
||||
Goondex/
|
||||
├── cmd/
|
||||
│ ├── goondex/ # CLI application
|
||||
│ └── goondexd/ # Daemon (future)
|
||||
├── internal/
|
||||
│ ├── db/ # Database layer
|
||||
│ ├── images/ # Image management
|
||||
│ ├── metadata/ # Resolver/strategies
|
||||
│ ├── model/ # Data models
|
||||
│ ├── scraper/ # Scraper system
|
||||
│ ├── ui/ # TUI/GUI
|
||||
│ └── util/ # Utilities
|
||||
├── config/ # YAML configs
|
||||
├── docs/ # Documentation
|
||||
├── scripts/ # Dev scripts
|
||||
└── assets/ # Static assets
|
||||
```
|
||||
|
||||
## Future Enhancements
|
||||
|
||||
### v0.2.x
|
||||
- Metadata resolver with field strategies
|
||||
- Image caching system
|
||||
- Full-text search (FTS5)
|
||||
- TUI browser
|
||||
|
||||
### v0.3.x
|
||||
- Background daemon
|
||||
- Scheduled imports
|
||||
- Duplicate detection
|
||||
- Preview sprites
|
||||
|
||||
### v0.4.x
|
||||
- Web UI
|
||||
- REST API
|
||||
- Multi-source priority
|
||||
- Plugin system
|
||||
365
docs/CLI_REFERENCE.md
Normal file
365
docs/CLI_REFERENCE.md
Normal file
|
|
@ -0,0 +1,365 @@
|
|||
# CLI Reference
|
||||
|
||||
Complete command-line interface documentation for Goondex.
|
||||
|
||||
## Global Flags
|
||||
|
||||
```bash
|
||||
--db string Path to SQLite database (default "./goondex.db")
|
||||
```
|
||||
|
||||
## Commands
|
||||
|
||||
### `goondex version`
|
||||
|
||||
Print version information.
|
||||
|
||||
**Usage**:
|
||||
```bash
|
||||
goondex version
|
||||
```
|
||||
|
||||
**Output**:
|
||||
```
|
||||
Goondex v0.1.0-dev2
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### `goondex performer-search`
|
||||
|
||||
Search for performers in the local database.
|
||||
|
||||
**Usage**:
|
||||
```bash
|
||||
goondex performer-search [query]
|
||||
```
|
||||
|
||||
**Arguments**:
|
||||
- `query` - Search term (searches name and aliases)
|
||||
|
||||
**Examples**:
|
||||
```bash
|
||||
# Search for performers matching "Riley"
|
||||
goondex performer-search "Riley"
|
||||
|
||||
# Search with wildcards (SQL LIKE syntax)
|
||||
goondex performer-search "Riley%"
|
||||
```
|
||||
|
||||
**Output Format**:
|
||||
```
|
||||
Found 3 performer(s):
|
||||
|
||||
ID: 1
|
||||
Name: Riley Reid
|
||||
Aliases: Paige Riley
|
||||
Country: United States
|
||||
Gender: female
|
||||
---
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### `goondex studio-search`
|
||||
|
||||
Search for studios in the local database.
|
||||
|
||||
**Usage**:
|
||||
```bash
|
||||
goondex studio-search [query]
|
||||
```
|
||||
|
||||
**Arguments**:
|
||||
- `query` - Search term (searches studio name)
|
||||
|
||||
**Examples**:
|
||||
```bash
|
||||
# Search for studios matching "Brazzers"
|
||||
goondex studio-search "Brazzers"
|
||||
```
|
||||
|
||||
**Output Format**:
|
||||
```
|
||||
Found 2 studio(s):
|
||||
|
||||
ID: 1
|
||||
Name: Brazzers
|
||||
Description: Premium adult entertainment network
|
||||
---
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### `goondex scene-search`
|
||||
|
||||
Search for scenes in the local database.
|
||||
|
||||
**Usage**:
|
||||
```bash
|
||||
goondex scene-search [query]
|
||||
```
|
||||
|
||||
**Arguments**:
|
||||
- `query` - Search term (searches title and code)
|
||||
|
||||
**Examples**:
|
||||
```bash
|
||||
# Search for scenes matching "Big Wet Butts"
|
||||
goondex scene-search "Big Wet Butts"
|
||||
|
||||
# Search by DVD code
|
||||
goondex scene-search "BWB-2024-01"
|
||||
```
|
||||
|
||||
**Output Format**:
|
||||
```
|
||||
Found 5 scene(s):
|
||||
|
||||
ID: 1
|
||||
Title: Big Wet Butts 24
|
||||
Code: BWB-024
|
||||
Date: 2024-01-15
|
||||
Description: The hottest scenes featuring...
|
||||
---
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Import Commands
|
||||
|
||||
### `goondex import performer`
|
||||
|
||||
Search ThePornDB for performers and import them to the local database.
|
||||
|
||||
**Usage**:
|
||||
```bash
|
||||
goondex import performer [query]
|
||||
```
|
||||
|
||||
**Arguments**:
|
||||
- `query` - Search term for TPDB
|
||||
|
||||
**Environment Variables**:
|
||||
- `TPDB_API_KEY` - **Required**. Your ThePornDB API key
|
||||
|
||||
**Examples**:
|
||||
```bash
|
||||
# Set API key
|
||||
export TPDB_API_KEY="your-api-key-here"
|
||||
|
||||
# Import performers matching "Riley Reid"
|
||||
goondex import performer "Riley Reid"
|
||||
```
|
||||
|
||||
**Output**:
|
||||
```
|
||||
Searching TPDB for performers matching 'Riley Reid'...
|
||||
Found 1 performer(s) on TPDB
|
||||
|
||||
Importing: Riley Reid (TPDB ID: 12345)
|
||||
✓ Imported with local ID: 1
|
||||
|
||||
✓ Successfully imported 1/1 performers
|
||||
```
|
||||
|
||||
**Error Handling**:
|
||||
- If `TPDB_API_KEY` is not set: Error message
|
||||
- If no results found: "No performers found on TPDB"
|
||||
- If import fails: Warning with error details, continues to next
|
||||
|
||||
---
|
||||
|
||||
### `goondex import studio`
|
||||
|
||||
Search ThePornDB for studios and import them to the local database.
|
||||
|
||||
**Usage**:
|
||||
```bash
|
||||
goondex import studio [query]
|
||||
```
|
||||
|
||||
**Arguments**:
|
||||
- `query` - Search term for TPDB
|
||||
|
||||
**Environment Variables**:
|
||||
- `TPDB_API_KEY` - **Required**
|
||||
|
||||
**Examples**:
|
||||
```bash
|
||||
# Import studios matching "Brazzers"
|
||||
goondex import studio "Brazzers"
|
||||
```
|
||||
|
||||
**Output**:
|
||||
```
|
||||
Searching TPDB for studios matching 'Brazzers'...
|
||||
Found 2 studio(s) on TPDB
|
||||
|
||||
Importing: Brazzers (TPDB ID: 100)
|
||||
✓ Imported with local ID: 1
|
||||
Importing: Brazzers Network (TPDB ID: 101)
|
||||
✓ Imported with local ID: 2
|
||||
|
||||
✓ Successfully imported 2/2 studios
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### `goondex import scene`
|
||||
|
||||
Search ThePornDB for scenes and import them to the local database.
|
||||
|
||||
This command also imports related performers, studios, and tags.
|
||||
|
||||
**Usage**:
|
||||
```bash
|
||||
goondex import scene [query]
|
||||
```
|
||||
|
||||
**Arguments**:
|
||||
- `query` - Search term for TPDB
|
||||
|
||||
**Environment Variables**:
|
||||
- `TPDB_API_KEY` - **Required**
|
||||
|
||||
**Examples**:
|
||||
```bash
|
||||
# Import scenes matching "Big Wet Butts"
|
||||
goondex import scene "Big Wet Butts"
|
||||
```
|
||||
|
||||
**Output**:
|
||||
```
|
||||
Searching TPDB for scenes matching 'Big Wet Butts'...
|
||||
Found 3 scene(s) on TPDB
|
||||
|
||||
Importing: Big Wet Butts 24 (TPDB ID: 54321)
|
||||
✓ Imported with local ID: 1
|
||||
|
||||
✓ Successfully imported 3/3 scenes
|
||||
```
|
||||
|
||||
**Automatic Imports**:
|
||||
When importing a scene, Goondex automatically:
|
||||
1. Imports the studio (if not already present)
|
||||
2. Imports all performers (if not already present)
|
||||
3. Imports all tags (if not already present)
|
||||
4. Links performers and tags to the scene
|
||||
|
||||
---
|
||||
|
||||
## Configuration
|
||||
|
||||
### Database Location
|
||||
|
||||
Override the default database location:
|
||||
|
||||
```bash
|
||||
goondex --db /path/to/custom.db performer-search "Riley"
|
||||
```
|
||||
|
||||
### Environment Variables
|
||||
|
||||
| Variable | Required | Default | Description |
|
||||
|----------------|----------|---------|----------------------------|
|
||||
| TPDB_API_KEY | Yes* | - | ThePornDB API key |
|
||||
|
||||
*Required only for import commands
|
||||
|
||||
### Getting a TPDB API Key
|
||||
|
||||
1. Register at https://theporndb.net/register
|
||||
2. Navigate to https://theporndb.net/user/api-tokens
|
||||
3. Generate a new API token
|
||||
4. Export it: `export TPDB_API_KEY="your-key"`
|
||||
|
||||
---
|
||||
|
||||
## Exit Codes
|
||||
|
||||
| Code | Meaning |
|
||||
|------|----------------------------------|
|
||||
| 0 | Success |
|
||||
| 1 | General error |
|
||||
|
||||
---
|
||||
|
||||
## Examples
|
||||
|
||||
### Complete Workflow
|
||||
|
||||
```bash
|
||||
# 1. Set up API key
|
||||
export TPDB_API_KEY="DNdmkEPlUTWWDuT98jYcxCIfAxReDrHJ55PyCGru496ae577"
|
||||
|
||||
# 2. Import a performer
|
||||
goondex import performer "Riley Reid"
|
||||
|
||||
# 3. Import a studio
|
||||
goondex import studio "Brazzers"
|
||||
|
||||
# 4. Import scenes
|
||||
goondex import scene "Big Wet Butts"
|
||||
|
||||
# 5. Search local database
|
||||
goondex performer-search "Riley"
|
||||
goondex studio-search "Brazzers"
|
||||
goondex scene-search "Big Wet"
|
||||
```
|
||||
|
||||
### Custom Database Location
|
||||
|
||||
```bash
|
||||
# Use a different database
|
||||
export GOONDEX_DB="/mnt/storage/media.db"
|
||||
goondex --db "$GOONDEX_DB" import performer "Riley Reid"
|
||||
goondex --db "$GOONDEX_DB" performer-search "Riley"
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### "TPDB_API_KEY environment variable is not set"
|
||||
|
||||
**Solution**: Export your API key before running import commands:
|
||||
```bash
|
||||
export TPDB_API_KEY="your-key-here"
|
||||
```
|
||||
|
||||
### "No performers found on TPDB"
|
||||
|
||||
**Possible Causes**:
|
||||
1. Typo in search query
|
||||
2. TPDB doesn't have that performer
|
||||
3. API rate limiting (wait a minute and retry)
|
||||
|
||||
### "API returned status 401"
|
||||
|
||||
**Cause**: Invalid or expired API key
|
||||
|
||||
**Solution**:
|
||||
1. Verify your API key at https://theporndb.net/user/api-tokens
|
||||
2. Generate a new key if needed
|
||||
3. Update your environment variable
|
||||
|
||||
### "API returned status 429"
|
||||
|
||||
**Cause**: Rate limiting
|
||||
|
||||
**Solution**: Wait 60 seconds and retry
|
||||
|
||||
---
|
||||
|
||||
## Future Commands (Planned)
|
||||
|
||||
### v0.2.x
|
||||
- `goondex identify --scene-id <id>` - Match local scene to TPDB
|
||||
- `goondex sync` - Sync all data from TPDB
|
||||
- `goondex tui` - Launch interactive browser
|
||||
|
||||
### v0.3.x
|
||||
- `goondex daemon start` - Start background service
|
||||
- `goondex daemon stop` - Stop background service
|
||||
- `goondex daemon status` - Check daemon status
|
||||
304
docs/DATABASE_SCHEMA.md
Normal file
304
docs/DATABASE_SCHEMA.md
Normal file
|
|
@ -0,0 +1,304 @@
|
|||
# Database Schema
|
||||
|
||||
## Overview
|
||||
|
||||
Goondex uses SQLite as its database engine with the following characteristics:
|
||||
- **Journal Mode**: WAL (Write-Ahead Logging)
|
||||
- **Foreign Keys**: Enabled
|
||||
- **Date Format**: RFC3339 (ISO 8601)
|
||||
- **Location**: Configurable (default: `./goondex.db`)
|
||||
|
||||
## Schema Diagram
|
||||
|
||||
```
|
||||
┌──────────────┐ ┌──────────────────┐ ┌──────────────┐
|
||||
│ performers │ │ scene_performers │ │ scenes │
|
||||
├──────────────┤ ├──────────────────┤ ├──────────────┤
|
||||
│ id (PK) │ │ scene_id (FK) │ │ id (PK) │
|
||||
│ name │◄──────┤ performer_id(FK) ├──────►│ title │
|
||||
│ aliases │ └──────────────────┘ │ code │
|
||||
│ nationality │ │ date │
|
||||
│ country │ │ studio_id(FK)│
|
||||
│ gender │ │ description │
|
||||
│ image_path │ │ image_path │
|
||||
│ image_url │ │ image_url │
|
||||
│ bio │ │ director │
|
||||
│ source │ │ url │
|
||||
│ source_id │ │ source │
|
||||
│ created_at │ │ source_id │
|
||||
│ updated_at │ │ created_at │
|
||||
└──────────────┘ │ updated_at │
|
||||
└──────────────┘
|
||||
│
|
||||
│
|
||||
┌──────────────────┐ │
|
||||
│ scene_tags │ │
|
||||
├──────────────────┤ │
|
||||
│ scene_id (FK) │◄─────────────────┘
|
||||
│ tag_id (FK) ├──────┐
|
||||
└──────────────────┘ │
|
||||
│
|
||||
┌──────────────┐ │
|
||||
│ studios │ │
|
||||
├──────────────┤ │
|
||||
│ id (PK) │ │
|
||||
│ name │ │
|
||||
│ parent_id(FK)│────┐ │
|
||||
│ image_path │ │(self-ref) │
|
||||
│ image_url │ │ │
|
||||
│ description │ │ │
|
||||
│ source │ │ ▼
|
||||
│ source_id │ │ ┌──────────────┐
|
||||
│ created_at │ │ │ tags │
|
||||
│ updated_at │ │ ├──────────────┤
|
||||
└──────────────┘ │ │ id (PK) │
|
||||
▲ │ │ name (UNIQUE)│
|
||||
│ │ │ source │
|
||||
│ │ │ source_id │
|
||||
└───────────┘ │ created_at │
|
||||
│ updated_at │
|
||||
└──────────────┘
|
||||
```
|
||||
|
||||
## Table Definitions
|
||||
|
||||
### `performers`
|
||||
|
||||
Stores performer/actor information.
|
||||
|
||||
| Column | Type | Constraints | Description |
|
||||
|--------------|---------|-----------------------|--------------------------------|
|
||||
| id | INTEGER | PRIMARY KEY AUTOINCR | Local database ID |
|
||||
| name | TEXT | NOT NULL | Performer's name |
|
||||
| aliases | TEXT | | Comma-separated aliases |
|
||||
| nationality | TEXT | | ISO country code (e.g., "US") |
|
||||
| country | TEXT | | Full country name |
|
||||
| gender | TEXT | | male/female/trans/other |
|
||||
| image_path | TEXT | | Local file path to image |
|
||||
| image_url | TEXT | | Remote URL to image |
|
||||
| bio | TEXT | | Biography/description |
|
||||
| source | TEXT | | Source name (e.g., "tpdb") |
|
||||
| source_id | TEXT | | ID at the source |
|
||||
| created_at | TEXT | NOT NULL, DEFAULT NOW | RFC3339 timestamp |
|
||||
| updated_at | TEXT | NOT NULL, DEFAULT NOW | RFC3339 timestamp |
|
||||
|
||||
**Indexes**:
|
||||
- `idx_performers_name` on `name`
|
||||
|
||||
### `studios`
|
||||
|
||||
Stores production companies and networks.
|
||||
|
||||
| Column | Type | Constraints | Description |
|
||||
|--------------|---------|-----------------------|--------------------------------|
|
||||
| id | INTEGER | PRIMARY KEY AUTOINCR | Local database ID |
|
||||
| name | TEXT | NOT NULL | Studio/site name |
|
||||
| parent_id | INTEGER | FK → studios(id) | Parent studio/network ID |
|
||||
| image_path | TEXT | | Local file path to logo |
|
||||
| image_url | TEXT | | Remote URL to logo |
|
||||
| description | TEXT | | Studio description |
|
||||
| source | TEXT | | Source name |
|
||||
| source_id | TEXT | | ID at the source |
|
||||
| created_at | TEXT | NOT NULL, DEFAULT NOW | RFC3339 timestamp |
|
||||
| updated_at | TEXT | NOT NULL, DEFAULT NOW | RFC3339 timestamp |
|
||||
|
||||
**Indexes**:
|
||||
- `idx_studios_name` on `name`
|
||||
|
||||
**Foreign Keys**:
|
||||
- `parent_id` references `studios(id)` ON DELETE SET NULL
|
||||
|
||||
### `tags`
|
||||
|
||||
Stores content tags and categories.
|
||||
|
||||
| Column | Type | Constraints | Description |
|
||||
|--------------|---------|-----------------------|--------------------------------|
|
||||
| id | INTEGER | PRIMARY KEY AUTOINCR | Local database ID |
|
||||
| name | TEXT | NOT NULL, UNIQUE | Tag name |
|
||||
| source | TEXT | | Source name |
|
||||
| source_id | TEXT | | ID at the source |
|
||||
| created_at | TEXT | NOT NULL, DEFAULT NOW | RFC3339 timestamp |
|
||||
| updated_at | TEXT | NOT NULL, DEFAULT NOW | RFC3339 timestamp |
|
||||
|
||||
**Indexes**:
|
||||
- `idx_tags_name` on `name`
|
||||
|
||||
### `scenes`
|
||||
|
||||
Stores video scenes/titles.
|
||||
|
||||
| Column | Type | Constraints | Description |
|
||||
|--------------|---------|-----------------------|--------------------------------|
|
||||
| id | INTEGER | PRIMARY KEY AUTOINCR | Local database ID |
|
||||
| title | TEXT | NOT NULL | Scene title |
|
||||
| code | TEXT | | DVD code, scene ID, etc. |
|
||||
| date | TEXT | | Release date (ISO 8601) |
|
||||
| studio_id | INTEGER | FK → studios(id) | Studio ID |
|
||||
| description | TEXT | | Scene description/synopsis |
|
||||
| image_path | TEXT | | Local file path to cover |
|
||||
| image_url | TEXT | | Remote URL to cover |
|
||||
| director | TEXT | | Director name |
|
||||
| url | TEXT | | Scene URL |
|
||||
| source | TEXT | | Source name |
|
||||
| source_id | TEXT | | ID at the source |
|
||||
| created_at | TEXT | NOT NULL, DEFAULT NOW | RFC3339 timestamp |
|
||||
| updated_at | TEXT | NOT NULL, DEFAULT NOW | RFC3339 timestamp |
|
||||
|
||||
**Indexes**:
|
||||
- `idx_scenes_title` on `title`
|
||||
- `idx_scenes_code` on `code`
|
||||
|
||||
**Foreign Keys**:
|
||||
- `studio_id` references `studios(id)` ON DELETE SET NULL
|
||||
|
||||
### `scene_performers`
|
||||
|
||||
Junction table for many-to-many relationship between scenes and performers.
|
||||
|
||||
| Column | Type | Constraints | Description |
|
||||
|---------------|---------|----------------------------|----------------------|
|
||||
| scene_id | INTEGER | FK → scenes(id) | Scene ID |
|
||||
| performer_id | INTEGER | FK → performers(id) | Performer ID |
|
||||
|
||||
**Primary Key**: `(scene_id, performer_id)`
|
||||
|
||||
**Foreign Keys**:
|
||||
- `scene_id` references `scenes(id)` ON DELETE CASCADE
|
||||
- `performer_id` references `performers(id)` ON DELETE CASCADE
|
||||
|
||||
### `scene_tags`
|
||||
|
||||
Junction table for many-to-many relationship between scenes and tags.
|
||||
|
||||
| Column | Type | Constraints | Description |
|
||||
|------------|---------|----------------------------|----------------------|
|
||||
| scene_id | INTEGER | FK → scenes(id) | Scene ID |
|
||||
| tag_id | INTEGER | FK → tags(id) | Tag ID |
|
||||
|
||||
**Primary Key**: `(scene_id, tag_id)`
|
||||
|
||||
**Foreign Keys**:
|
||||
- `scene_id` references `scenes(id)` ON DELETE CASCADE
|
||||
- `tag_id` references `tags(id)` ON DELETE CASCADE
|
||||
|
||||
## Common Queries
|
||||
|
||||
### Search Performers
|
||||
|
||||
```sql
|
||||
SELECT * FROM performers
|
||||
WHERE name LIKE '%Riley%' OR aliases LIKE '%Riley%'
|
||||
ORDER BY name;
|
||||
```
|
||||
|
||||
### Get Scene with All Related Data
|
||||
|
||||
```sql
|
||||
-- Get scene
|
||||
SELECT * FROM scenes WHERE id = ?;
|
||||
|
||||
-- Get performers for scene
|
||||
SELECT p.* FROM performers p
|
||||
JOIN scene_performers sp ON p.id = sp.performer_id
|
||||
WHERE sp.scene_id = ?;
|
||||
|
||||
-- Get tags for scene
|
||||
SELECT t.* FROM tags t
|
||||
JOIN scene_tags st ON t.id = st.tag_id
|
||||
WHERE st.scene_id = ?;
|
||||
|
||||
-- Get studio for scene
|
||||
SELECT s.* FROM studios s
|
||||
JOIN scenes sc ON s.id = sc.studio_id
|
||||
WHERE sc.id = ?;
|
||||
```
|
||||
|
||||
### Find Scenes by Performer
|
||||
|
||||
```sql
|
||||
SELECT s.* FROM scenes s
|
||||
JOIN scene_performers sp ON s.id = sp.scene_id
|
||||
JOIN performers p ON sp.performer_id = p.id
|
||||
WHERE p.name LIKE '%Riley Reid%'
|
||||
ORDER BY s.date DESC;
|
||||
```
|
||||
|
||||
### Get Studio Hierarchy
|
||||
|
||||
```sql
|
||||
-- Get parent studios
|
||||
WITH RECURSIVE studio_tree AS (
|
||||
SELECT id, name, parent_id, 0 AS level
|
||||
FROM studios
|
||||
WHERE id = ?
|
||||
|
||||
UNION ALL
|
||||
|
||||
SELECT s.id, s.name, s.parent_id, st.level + 1
|
||||
FROM studios s
|
||||
JOIN studio_tree st ON s.parent_id = st.id
|
||||
)
|
||||
SELECT * FROM studio_tree;
|
||||
```
|
||||
|
||||
## Migration Strategy
|
||||
|
||||
For future schema changes:
|
||||
1. Version tracking in a `schema_version` table
|
||||
2. Migration scripts in `internal/db/migrations/`
|
||||
3. Up/down migration support
|
||||
4. Automatic migration on startup
|
||||
|
||||
## Performance Considerations
|
||||
|
||||
### Current Indexes
|
||||
- Name fields (performers, studios, tags, scenes)
|
||||
- Code field (scenes)
|
||||
|
||||
### Future Optimizations (v0.2+)
|
||||
- Full-text search indexes (FTS5)
|
||||
- Composite indexes for common queries
|
||||
- Materialized views for complex joins
|
||||
- Separate alias table for normalized storage
|
||||
|
||||
## Backup and Restore
|
||||
|
||||
### Backup
|
||||
```bash
|
||||
# SQLite backup
|
||||
sqlite3 goondex.db ".backup goondex-backup.db"
|
||||
|
||||
# Or simple copy (safe with WAL mode)
|
||||
cp goondex.db goondex-backup.db
|
||||
```
|
||||
|
||||
### Restore
|
||||
```bash
|
||||
# Copy backup over current
|
||||
cp goondex-backup.db goondex.db
|
||||
```
|
||||
|
||||
## Database Maintenance
|
||||
|
||||
### Analyze Statistics
|
||||
```sql
|
||||
ANALYZE;
|
||||
```
|
||||
|
||||
### Vacuum (Rebuild)
|
||||
```sql
|
||||
VACUUM;
|
||||
```
|
||||
|
||||
### Check Integrity
|
||||
```sql
|
||||
PRAGMA integrity_check;
|
||||
```
|
||||
|
||||
### View Database Info
|
||||
```sql
|
||||
PRAGMA database_list;
|
||||
PRAGMA table_list;
|
||||
PRAGMA foreign_key_check;
|
||||
```
|
||||
74
docs/INDEX.md
Normal file
74
docs/INDEX.md
Normal file
|
|
@ -0,0 +1,74 @@
|
|||
# Goondex Documentation
|
||||
|
||||
**Version**: v0.1.0-dev2
|
||||
**Last Updated**: 2025-11-14
|
||||
|
||||
## Welcome to Goondex
|
||||
|
||||
Goondex is a fast, local-first media indexer for adult content. It ingests metadata from external sources (like ThePornDB), normalizes it, and stores it in a lightweight SQLite database for quick search and retrieval.
|
||||
|
||||
## Documentation Structure
|
||||
|
||||
### Getting Started
|
||||
- [Installation Guide](INSTALLATION.md) - Setup and installation
|
||||
- [Quick Start](QUICKSTART.md) - Get up and running in 5 minutes
|
||||
- [CLI Reference](CLI_REFERENCE.md) - Complete command-line interface documentation
|
||||
|
||||
### Core Concepts
|
||||
- [Architecture Overview](ARCHITECTURE.md) - System design and components
|
||||
- [Database Schema](DATABASE_SCHEMA.md) - SQLite database structure
|
||||
- [Data Models](DATA_MODELS.md) - Internal data structures
|
||||
|
||||
### Integration
|
||||
- [TPDB Integration](TPDB_INTEGRATION.md) - ThePornDB API integration guide
|
||||
- [Scraper System](SCRAPER_SYSTEM.md) - How scrapers work
|
||||
- [Adding New Sources](ADDING_SOURCES.md) - Implementing new data sources
|
||||
|
||||
### Development
|
||||
- [Development Guide](DEVELOPMENT.md) - Setting up a development environment
|
||||
- [Contributing](CONTRIBUTING.md) - How to contribute to Goondex
|
||||
- [API Design](API_DESIGN.md) - Internal API design principles
|
||||
- [Testing](TESTING.md) - Testing strategies and tools
|
||||
|
||||
### Configuration
|
||||
- [Configuration Reference](CONFIGURATION.md) - All configuration options
|
||||
- [Metadata Strategies](METADATA_STRATEGIES.md) - Field merge strategies (Stash-inspired)
|
||||
|
||||
### Roadmap
|
||||
- [Roadmap](ROADMAP.md) - Future plans and versioning
|
||||
- [Changelog](CHANGELOG.md) - Version history and changes
|
||||
|
||||
## Key Features
|
||||
|
||||
### Current (v0.1.0-dev2)
|
||||
- ✅ SQLite database with WAL mode
|
||||
- ✅ Full CRUD operations for performers, studios, scenes, and tags
|
||||
- ✅ TPDB scraper with real API integration
|
||||
- ✅ CLI search commands (local database)
|
||||
- ✅ CLI import commands (fetch from TPDB)
|
||||
- ✅ Relationship management (scenes ↔ performers, scenes ↔ tags)
|
||||
|
||||
### Planned
|
||||
- ⏳ Stash-style metadata resolution (field strategies)
|
||||
- ⏳ Image caching and management
|
||||
- ⏳ TUI (Terminal UI) browser
|
||||
- ⏳ Background daemon (goondexd)
|
||||
- ⏳ Web UI
|
||||
|
||||
## Quick Links
|
||||
|
||||
- **GitHub**: [Your Repo URL]
|
||||
- **Issue Tracker**: [Your Repo URL]/issues
|
||||
- **ThePornDB**: https://theporndb.net
|
||||
- **API Documentation**: https://api.theporndb.net/docs
|
||||
|
||||
## Support
|
||||
|
||||
For help and support:
|
||||
1. Check the [Troubleshooting Guide](TROUBLESHOOTING.md)
|
||||
2. Search [existing issues](issues)
|
||||
3. Create a new issue if needed
|
||||
|
||||
## License
|
||||
|
||||
[Your License]
|
||||
331
docs/TPDB_INTEGRATION.md
Normal file
331
docs/TPDB_INTEGRATION.md
Normal file
|
|
@ -0,0 +1,331 @@
|
|||
# ThePornDB Integration Guide
|
||||
|
||||
## Overview
|
||||
|
||||
Goondex integrates with [ThePornDB](https://theporndb.net) (TPDB) to fetch high-quality metadata for performers, studios, and scenes. TPDB is a community-driven database similar to IMDB but for adult content.
|
||||
|
||||
## Prerequisites
|
||||
|
||||
### 1. TPDB Account
|
||||
|
||||
1. Register at https://theporndb.net/register
|
||||
2. Verify your email
|
||||
3. Log in to your account
|
||||
|
||||
### 2. API Token
|
||||
|
||||
1. Navigate to https://theporndb.net/user/api-tokens
|
||||
2. Click "Create New Token"
|
||||
3. Give it a name (e.g., "Goondex")
|
||||
4. Copy the generated token
|
||||
5. **Save it securely** - it won't be shown again
|
||||
|
||||
### 3. Set Environment Variable
|
||||
|
||||
```bash
|
||||
# Linux/macOS (add to ~/.bashrc or ~/.zshrc for persistence)
|
||||
export TPDB_API_KEY="your-api-key-here"
|
||||
|
||||
# Windows (PowerShell)
|
||||
$env:TPDB_API_KEY="your-api-key-here"
|
||||
|
||||
# Windows (CMD)
|
||||
set TPDB_API_KEY=your-api-key-here
|
||||
```
|
||||
|
||||
## API Endpoints
|
||||
|
||||
Goondex uses the following TPDB API endpoints:
|
||||
|
||||
| Endpoint | Method | Purpose |
|
||||
|----------|--------|---------|
|
||||
| `/performers` | GET | Search for performers |
|
||||
| `/performers/{id}` | GET | Get performer by ID |
|
||||
| `/sites` | GET | Search for studios/sites |
|
||||
| `/sites/{id}` | GET | Get studio by ID |
|
||||
| `/scenes` | GET | Search for scenes |
|
||||
| `/scenes/{id}` | GET | Get scene by ID |
|
||||
|
||||
**Base URL**: `https://api.theporndb.net`
|
||||
|
||||
**Authentication**: Bearer token in `Authorization` header
|
||||
|
||||
## Data Mapping
|
||||
|
||||
### Performer Fields
|
||||
|
||||
| TPDB Field | Goondex Field | Notes |
|
||||
|------------|---------------|-------|
|
||||
| `id` | `source_id` | TPDB ID stored as source reference |
|
||||
| `name` | `name` | Primary name |
|
||||
| `aliases` | `aliases` | Comma-separated alternative names |
|
||||
| `nationality` | `nationality` + `country` | Both fields populated |
|
||||
| `gender` | `gender` | male/female/trans/other |
|
||||
| `image` | `image_url` | Poster image URL |
|
||||
| `bio` | `bio` | Biography text |
|
||||
|
||||
**Additional TPDB fields** (available but not currently stored):
|
||||
- `birthday`, `astrology`, `birthplace`
|
||||
- `ethnicity`, `eye_color`, `hair_color`
|
||||
- `height`, `weight`, `measurements`
|
||||
- `tattoo_description`, `piercing_description`
|
||||
- `boob_job`, `active`
|
||||
|
||||
### Studio Fields
|
||||
|
||||
| TPDB Field | Goondex Field | Notes |
|
||||
|------------|---------------|-------|
|
||||
| `id` | `source_id` | TPDB ID |
|
||||
| `name` | `name` | Studio/site name |
|
||||
| `description` | `description` | About the studio |
|
||||
| `logo` | `image_url` | Logo image URL |
|
||||
| `parent.id` | `parent_id` | Parent network ID (not yet implemented) |
|
||||
| `url` | - | Currently stored in description |
|
||||
|
||||
### Scene Fields
|
||||
|
||||
| TPDB Field | Goondex Field | Notes |
|
||||
|------------|---------------|-------|
|
||||
| `id` | `source_id` | TPDB ID |
|
||||
| `uuid` | - | Not stored |
|
||||
| `title` | `title` | Scene title |
|
||||
| `description` | `description` | Scene synopsis |
|
||||
| `url` | `url` | Scene URL |
|
||||
| `date` | `date` | Release date (ISO 8601) |
|
||||
| `image` | `image_url` | Cover image URL |
|
||||
| `director` | `director` | Director name |
|
||||
| `code` | `code` | DVD code or scene identifier |
|
||||
| `site.id` | `studio_id` | References studios table |
|
||||
| `performers[]` | Scene-performer relationship | Many-to-many |
|
||||
| `tags[]` | Scene-tag relationship | Many-to-many |
|
||||
|
||||
**Additional TPDB fields** (available but not currently stored):
|
||||
- `poster`, `duration`
|
||||
|
||||
## Usage Examples
|
||||
|
||||
### Import a Performer
|
||||
|
||||
```bash
|
||||
# Search and import Riley Reid
|
||||
export TPDB_API_KEY="your-key"
|
||||
goondex import performer "Riley Reid"
|
||||
```
|
||||
|
||||
**What happens**:
|
||||
1. Goondex queries TPDB API: `GET /performers?q=Riley+Reid`
|
||||
2. TPDB returns matching performers (usually 1 exact match)
|
||||
3. Goondex maps TPDB fields to internal model
|
||||
4. Data is inserted into local SQLite database
|
||||
5. Local ID is assigned and reported
|
||||
|
||||
### Import a Studio
|
||||
|
||||
```bash
|
||||
# Import Brazzers
|
||||
goondex import studio "Brazzers"
|
||||
```
|
||||
|
||||
**What happens**:
|
||||
1. Query: `GET /sites?q=Brazzers`
|
||||
2. TPDB returns matching sites
|
||||
3. Studios are created in local database
|
||||
|
||||
### Import a Scene
|
||||
|
||||
```bash
|
||||
# Import a scene
|
||||
goondex import scene "Big Wet Butts 24"
|
||||
```
|
||||
|
||||
**What happens**:
|
||||
1. Query: `GET /scenes?q=Big+Wet+Butts+24`
|
||||
2. TPDB returns matching scenes with embedded performers, studio, and tags
|
||||
3. **Studio** is imported/updated first
|
||||
4. **Scene** is created with reference to studio
|
||||
5. **Performers** are imported/updated
|
||||
6. **Tags** are imported/updated
|
||||
7. **Relationships** are created in junction tables
|
||||
|
||||
## API Rate Limiting
|
||||
|
||||
TPDB implements rate limiting to prevent abuse.
|
||||
|
||||
### Current Limits (as of 2024)
|
||||
- **Requests per minute**: ~60
|
||||
- **Requests per hour**: ~1000
|
||||
|
||||
### Best Practices
|
||||
|
||||
1. **Batch imports**: Import multiple items at once instead of one-by-one
|
||||
2. **Cache locally**: Don't re-import already imported items
|
||||
3. **Error handling**: Implement exponential backoff on 429 errors
|
||||
|
||||
### Handling Rate Limits
|
||||
|
||||
If you receive a `429 Too Many Requests` response:
|
||||
|
||||
```bash
|
||||
# Wait 60 seconds
|
||||
sleep 60
|
||||
|
||||
# Retry the command
|
||||
goondex import performer "Riley Reid"
|
||||
```
|
||||
|
||||
## Response Format
|
||||
|
||||
### Standard Response Wrapper
|
||||
|
||||
```json
|
||||
{
|
||||
"data": { ... }, // Single object or array
|
||||
"meta": { // Optional pagination
|
||||
"current_page": 1,
|
||||
"from": 1,
|
||||
"last_page": 5,
|
||||
"per_page": 25,
|
||||
"to": 25,
|
||||
"total": 100
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Performer Response Example
|
||||
|
||||
```json
|
||||
{
|
||||
"data": {
|
||||
"id": "12345",
|
||||
"name": "Riley Reid",
|
||||
"slug": "riley-reid",
|
||||
"gender": "female",
|
||||
"aliases": "Paige Riley",
|
||||
"nationality": "US",
|
||||
"image": "https://cdn.theporndb.net/performers/riley-reid.jpg",
|
||||
"bio": "Riley Reid is an American adult film actress..."
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Scene Response Example
|
||||
|
||||
```json
|
||||
{
|
||||
"data": {
|
||||
"id": "54321",
|
||||
"uuid": "abc-def-123",
|
||||
"title": "Big Wet Butts 24",
|
||||
"date": "2024-01-15",
|
||||
"description": "Riley Reid stars in...",
|
||||
"code": "BWB-024",
|
||||
"image": "https://cdn.theporndb.net/scenes/bwb-024.jpg",
|
||||
"site": {
|
||||
"id": "100",
|
||||
"name": "Brazzers",
|
||||
"url": "https://brazzers.com"
|
||||
},
|
||||
"performers": [
|
||||
{
|
||||
"id": "12345",
|
||||
"name": "Riley Reid",
|
||||
"gender": "female"
|
||||
}
|
||||
],
|
||||
"tags": [
|
||||
{
|
||||
"id": "1",
|
||||
"name": "Anal",
|
||||
"slug": "anal"
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Error Handling
|
||||
|
||||
### Common HTTP Status Codes
|
||||
|
||||
| Code | Meaning | Solution |
|
||||
|------|---------|----------|
|
||||
| 200 | Success | No action needed |
|
||||
| 401 | Unauthorized | Check API key |
|
||||
| 404 | Not Found | Item doesn't exist in TPDB |
|
||||
| 429 | Too Many Requests | Wait and retry |
|
||||
| 500 | Server Error | TPDB issue, wait and retry |
|
||||
|
||||
### Goondex Error Messages
|
||||
|
||||
```bash
|
||||
# Missing API key
|
||||
Error: TPDB_API_KEY environment variable is not set
|
||||
|
||||
# API error
|
||||
Error: failed to search TPDB: API returned status 401
|
||||
|
||||
# No results
|
||||
No performers found on TPDB
|
||||
```
|
||||
|
||||
## Advanced Usage
|
||||
|
||||
### Searching by ID
|
||||
|
||||
While Goondex doesn't currently expose this via CLI, you can fetch by TPDB ID programmatically:
|
||||
|
||||
```go
|
||||
import "git.leaktechnologies.dev/stu/Goondex/internal/scraper/tpdb"
|
||||
|
||||
scraper := tpdb.NewScraper("https://api.theporndb.net", apiKey)
|
||||
performer, err := scraper.GetPerformerByID(ctx, "12345")
|
||||
```
|
||||
|
||||
### Pagination
|
||||
|
||||
TPDB returns paginated results. Goondex currently fetches only the first page (default 25 results). Future versions will support pagination.
|
||||
|
||||
## Data Quality
|
||||
|
||||
### TPDB Strengths
|
||||
- ✅ High-quality, curated performer data
|
||||
- ✅ Comprehensive scene metadata
|
||||
- ✅ Studio/network relationships
|
||||
- ✅ Active community maintenance
|
||||
- ✅ Regular updates
|
||||
|
||||
### Known Limitations
|
||||
- ⚠️ Not all performers have complete bio data
|
||||
- ⚠️ Some older scenes may have limited metadata
|
||||
- ⚠️ Parent studio relationships not always populated
|
||||
- ⚠️ Image URLs may expire or change
|
||||
|
||||
## Future Enhancements
|
||||
|
||||
### v0.2.x
|
||||
- **Incremental updates**: Re-import to update changed data
|
||||
- **Image caching**: Download and cache images locally
|
||||
- **Pagination support**: Fetch all pages of results
|
||||
|
||||
### v0.3.x
|
||||
- **Scheduled syncing**: Automatic daily/weekly updates
|
||||
- **Duplicate detection**: Merge duplicate entries
|
||||
- **Conflict resolution**: Handle data conflicts intelligently
|
||||
|
||||
### v0.4.x
|
||||
- **Multi-source priority**: Combine TPDB with other sources
|
||||
- **Manual overrides**: User-edited fields protected from updates
|
||||
|
||||
## API Documentation
|
||||
|
||||
Official TPDB API documentation:
|
||||
- https://api.theporndb.net/docs
|
||||
|
||||
## Support
|
||||
|
||||
For TPDB-related issues:
|
||||
- **TPDB Discord**: https://discord.gg/theporndb
|
||||
- **TPDB GitHub**: https://github.com/ThePornDatabase
|
||||
|
||||
For Goondex integration issues:
|
||||
- **Goondex Issues**: https://git.leaktechnologies.dev/stu/Goondex/issues
|
||||
155
internal/scraper/tpdb/mapper.go
Normal file
155
internal/scraper/tpdb/mapper.go
Normal file
|
|
@ -0,0 +1,155 @@
|
|||
package tpdb
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"strconv"
|
||||
"strings"
|
||||
|
||||
"git.leaktechnologies.dev/stu/Goondex/internal/model"
|
||||
)
|
||||
|
||||
// mapPerformer converts a TPDB performer to our internal model
|
||||
func mapPerformer(p PerformerResponse) model.Performer {
|
||||
performer := model.Performer{
|
||||
Name: p.Name,
|
||||
Source: "tpdb",
|
||||
SourceID: p.ID,
|
||||
}
|
||||
|
||||
// Map optional fields
|
||||
if len(p.Aliases) > 0 {
|
||||
performer.Aliases = strings.Join(p.Aliases, ", ")
|
||||
}
|
||||
|
||||
if p.Gender != "" {
|
||||
performer.Gender = p.Gender
|
||||
}
|
||||
|
||||
if p.Nationality != nil {
|
||||
performer.Country = *p.Nationality
|
||||
performer.Nationality = *p.Nationality
|
||||
}
|
||||
|
||||
if p.Image != nil {
|
||||
performer.ImageURL = *p.Image
|
||||
}
|
||||
|
||||
// Build bio from available information
|
||||
bio := ""
|
||||
if p.Bio != nil {
|
||||
bio = *p.Bio
|
||||
}
|
||||
performer.Bio = bio
|
||||
|
||||
return performer
|
||||
}
|
||||
|
||||
// mapStudio converts a TPDB studio to our internal model
|
||||
func mapStudio(s StudioResponse) model.Studio {
|
||||
studio := model.Studio{
|
||||
Name: s.Name,
|
||||
Source: "tpdb",
|
||||
SourceID: strconv.Itoa(s.ID),
|
||||
}
|
||||
|
||||
if s.Description != nil {
|
||||
studio.Description = *s.Description
|
||||
}
|
||||
|
||||
if s.Logo != nil {
|
||||
studio.ImageURL = *s.Logo
|
||||
}
|
||||
|
||||
// Handle parent studio
|
||||
if s.Parent != nil {
|
||||
// We'll need to look up or create the parent studio separately
|
||||
// For now, we'll store the parent ID as a string that needs to be resolved
|
||||
// This is a limitation that should be handled by the import logic
|
||||
}
|
||||
|
||||
return studio
|
||||
}
|
||||
|
||||
// mapScene converts a TPDB scene to our internal model
|
||||
func mapScene(s SceneResponse) model.Scene {
|
||||
scene := model.Scene{
|
||||
Title: s.Title,
|
||||
Source: "tpdb",
|
||||
SourceID: s.ID,
|
||||
}
|
||||
|
||||
if s.Description != nil {
|
||||
scene.Description = *s.Description
|
||||
}
|
||||
|
||||
if s.URL != nil {
|
||||
scene.URL = *s.URL
|
||||
}
|
||||
|
||||
if s.Date != nil {
|
||||
scene.Date = *s.Date
|
||||
}
|
||||
|
||||
if s.Image != nil {
|
||||
scene.ImageURL = *s.Image
|
||||
}
|
||||
|
||||
if s.Director != nil {
|
||||
scene.Director = *s.Director
|
||||
}
|
||||
|
||||
if s.Code != nil {
|
||||
scene.Code = *s.Code
|
||||
}
|
||||
|
||||
// Map performers
|
||||
if len(s.Performers) > 0 {
|
||||
performers := make([]model.Performer, 0, len(s.Performers))
|
||||
for _, p := range s.Performers {
|
||||
performer := model.Performer{
|
||||
Name: p.Name,
|
||||
Source: "tpdb",
|
||||
SourceID: p.ID,
|
||||
}
|
||||
if p.Gender != nil {
|
||||
performer.Gender = *p.Gender
|
||||
}
|
||||
performers = append(performers, performer)
|
||||
}
|
||||
scene.Performers = performers
|
||||
}
|
||||
|
||||
// Map tags
|
||||
if len(s.Tags) > 0 {
|
||||
tags := make([]model.Tag, 0, len(s.Tags))
|
||||
for _, t := range s.Tags {
|
||||
tag := model.Tag{
|
||||
Name: t.Name,
|
||||
Source: "tpdb",
|
||||
SourceID: t.ID,
|
||||
}
|
||||
tags = append(tags, tag)
|
||||
}
|
||||
scene.Tags = tags
|
||||
}
|
||||
|
||||
// Map studio
|
||||
if s.Site != nil {
|
||||
studio := model.Studio{
|
||||
Name: s.Site.Name,
|
||||
Source: "tpdb",
|
||||
SourceID: strconv.Itoa(s.Site.ID),
|
||||
}
|
||||
if s.Site.URL != nil {
|
||||
studio.Description = fmt.Sprintf("URL: %s", *s.Site.URL)
|
||||
}
|
||||
scene.Studio = &studio
|
||||
}
|
||||
|
||||
return scene
|
||||
}
|
||||
|
||||
// stringToInt64 safely converts a string to int64
|
||||
func stringToInt64(s string) (int64, error) {
|
||||
return strconv.ParseInt(s, 10, 64)
|
||||
}
|
||||
|
|
@ -2,6 +2,9 @@ package tpdb
|
|||
|
||||
import (
|
||||
"context"
|
||||
"encoding/json"
|
||||
"fmt"
|
||||
"net/url"
|
||||
|
||||
"git.leaktechnologies.dev/stu/Goondex/internal/model"
|
||||
)
|
||||
|
|
@ -25,36 +28,147 @@ func (s *Scraper) Name() string {
|
|||
|
||||
// SearchPerformers searches for performers by query string
|
||||
func (s *Scraper) SearchPerformers(ctx context.Context, query string) ([]model.Performer, error) {
|
||||
// TODO: Implement TPDB performer search
|
||||
return nil, nil
|
||||
params := url.Values{}
|
||||
params.Set("q", query)
|
||||
|
||||
body, err := s.client.get(ctx, "/performers", params)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to search performers: %w", err)
|
||||
}
|
||||
|
||||
var apiResp APIResponse
|
||||
if err := json.Unmarshal(body, &apiResp); err != nil {
|
||||
return nil, fmt.Errorf("failed to parse response: %w", err)
|
||||
}
|
||||
|
||||
var tpdbPerformers []PerformerResponse
|
||||
if err := json.Unmarshal(apiResp.Data, &tpdbPerformers); err != nil {
|
||||
return nil, fmt.Errorf("failed to parse performers: %w", err)
|
||||
}
|
||||
|
||||
performers := make([]model.Performer, 0, len(tpdbPerformers))
|
||||
for _, p := range tpdbPerformers {
|
||||
performers = append(performers, mapPerformer(p))
|
||||
}
|
||||
|
||||
return performers, nil
|
||||
}
|
||||
|
||||
// SearchStudios searches for studios by query string
|
||||
func (s *Scraper) SearchStudios(ctx context.Context, query string) ([]model.Studio, error) {
|
||||
// TODO: Implement TPDB studio search
|
||||
return nil, nil
|
||||
params := url.Values{}
|
||||
params.Set("q", query)
|
||||
|
||||
body, err := s.client.get(ctx, "/sites", params)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to search studios: %w", err)
|
||||
}
|
||||
|
||||
var apiResp APIResponse
|
||||
if err := json.Unmarshal(body, &apiResp); err != nil {
|
||||
return nil, fmt.Errorf("failed to parse response: %w", err)
|
||||
}
|
||||
|
||||
var tpdbStudios []StudioResponse
|
||||
if err := json.Unmarshal(apiResp.Data, &tpdbStudios); err != nil {
|
||||
return nil, fmt.Errorf("failed to parse studios: %w", err)
|
||||
}
|
||||
|
||||
studios := make([]model.Studio, 0, len(tpdbStudios))
|
||||
for _, st := range tpdbStudios {
|
||||
studios = append(studios, mapStudio(st))
|
||||
}
|
||||
|
||||
return studios, nil
|
||||
}
|
||||
|
||||
// SearchScenes searches for scenes by query string
|
||||
func (s *Scraper) SearchScenes(ctx context.Context, query string) ([]model.Scene, error) {
|
||||
// TODO: Implement TPDB scene search
|
||||
return nil, nil
|
||||
params := url.Values{}
|
||||
params.Set("q", query)
|
||||
|
||||
body, err := s.client.get(ctx, "/scenes", params)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to search scenes: %w", err)
|
||||
}
|
||||
|
||||
var apiResp APIResponse
|
||||
if err := json.Unmarshal(body, &apiResp); err != nil {
|
||||
return nil, fmt.Errorf("failed to parse response: %w", err)
|
||||
}
|
||||
|
||||
var tpdbScenes []SceneResponse
|
||||
if err := json.Unmarshal(apiResp.Data, &tpdbScenes); err != nil {
|
||||
return nil, fmt.Errorf("failed to parse scenes: %w", err)
|
||||
}
|
||||
|
||||
scenes := make([]model.Scene, 0, len(tpdbScenes))
|
||||
for _, sc := range tpdbScenes {
|
||||
scenes = append(scenes, mapScene(sc))
|
||||
}
|
||||
|
||||
return scenes, nil
|
||||
}
|
||||
|
||||
// GetSceneByID retrieves a scene by its remote ID
|
||||
func (s *Scraper) GetSceneByID(ctx context.Context, remoteID string) (*model.Scene, error) {
|
||||
// TODO: Implement TPDB scene by ID
|
||||
return nil, nil
|
||||
body, err := s.client.get(ctx, "/scenes/"+remoteID, nil)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to get scene: %w", err)
|
||||
}
|
||||
|
||||
var apiResp APIResponse
|
||||
if err := json.Unmarshal(body, &apiResp); err != nil {
|
||||
return nil, fmt.Errorf("failed to parse response: %w", err)
|
||||
}
|
||||
|
||||
var tpdbScene SceneResponse
|
||||
if err := json.Unmarshal(apiResp.Data, &tpdbScene); err != nil {
|
||||
return nil, fmt.Errorf("failed to parse scene: %w", err)
|
||||
}
|
||||
|
||||
scene := mapScene(tpdbScene)
|
||||
return &scene, nil
|
||||
}
|
||||
|
||||
// GetPerformerByID retrieves a performer by its remote ID
|
||||
func (s *Scraper) GetPerformerByID(ctx context.Context, remoteID string) (*model.Performer, error) {
|
||||
// TODO: Implement TPDB performer by ID
|
||||
return nil, nil
|
||||
body, err := s.client.get(ctx, "/performers/"+remoteID, nil)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to get performer: %w", err)
|
||||
}
|
||||
|
||||
var apiResp APIResponse
|
||||
if err := json.Unmarshal(body, &apiResp); err != nil {
|
||||
return nil, fmt.Errorf("failed to parse response: %w", err)
|
||||
}
|
||||
|
||||
var tpdbPerformer PerformerResponse
|
||||
if err := json.Unmarshal(apiResp.Data, &tpdbPerformer); err != nil {
|
||||
return nil, fmt.Errorf("failed to parse performer: %w", err)
|
||||
}
|
||||
|
||||
performer := mapPerformer(tpdbPerformer)
|
||||
return &performer, nil
|
||||
}
|
||||
|
||||
// GetStudioByID retrieves a studio by its remote ID
|
||||
func (s *Scraper) GetStudioByID(ctx context.Context, remoteID string) (*model.Studio, error) {
|
||||
// TODO: Implement TPDB studio by ID
|
||||
return nil, nil
|
||||
body, err := s.client.get(ctx, "/sites/"+remoteID, nil)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to get studio: %w", err)
|
||||
}
|
||||
|
||||
var apiResp APIResponse
|
||||
if err := json.Unmarshal(body, &apiResp); err != nil {
|
||||
return nil, fmt.Errorf("failed to parse response: %w", err)
|
||||
}
|
||||
|
||||
var tpdbStudio StudioResponse
|
||||
if err := json.Unmarshal(apiResp.Data, &tpdbStudio); err != nil {
|
||||
return nil, fmt.Errorf("failed to parse studio: %w", err)
|
||||
}
|
||||
|
||||
studio := mapStudio(tpdbStudio)
|
||||
return &studio, nil
|
||||
}
|
||||
|
|
|
|||
102
internal/scraper/tpdb/types.go
Normal file
102
internal/scraper/tpdb/types.go
Normal file
|
|
@ -0,0 +1,102 @@
|
|||
package tpdb
|
||||
|
||||
import "encoding/json"
|
||||
|
||||
// APIResponse is the standard TPDB API response wrapper
|
||||
type APIResponse struct {
|
||||
Data json.RawMessage `json:"data"`
|
||||
Meta *MetaData `json:"meta,omitempty"`
|
||||
}
|
||||
|
||||
// MetaData contains pagination information
|
||||
type MetaData struct {
|
||||
CurrentPage int `json:"current_page"`
|
||||
From int `json:"from"`
|
||||
LastPage int `json:"last_page"`
|
||||
PerPage int `json:"per_page"`
|
||||
To int `json:"to"`
|
||||
Total int `json:"total"`
|
||||
}
|
||||
|
||||
// PerformerResponse represents a TPDB performer
|
||||
type PerformerResponse struct {
|
||||
ID string `json:"id"`
|
||||
Name string `json:"name"`
|
||||
Slug string `json:"slug"`
|
||||
Gender string `json:"gender"`
|
||||
Aliases []string `json:"aliases"`
|
||||
Birthday *string `json:"birthday"`
|
||||
Astrology *string `json:"astrology"`
|
||||
Birthplace *string `json:"birthplace"`
|
||||
Ethnicity *string `json:"ethnicity"`
|
||||
Nationality *string `json:"nationality"`
|
||||
EyeColor *string `json:"eye_color"`
|
||||
HairColor *string `json:"hair_color"`
|
||||
Height *int `json:"height"`
|
||||
Weight *int `json:"weight"`
|
||||
Measurements *string `json:"measurements"`
|
||||
TattooDescription *string `json:"tattoo_description"`
|
||||
PiercingDescription *string `json:"piercing_description"`
|
||||
BoobJob *string `json:"boob_job"`
|
||||
Bio *string `json:"bio"`
|
||||
Active *int `json:"active"`
|
||||
Image *string `json:"image"`
|
||||
Poster *string `json:"poster"`
|
||||
}
|
||||
|
||||
// StudioResponse represents a TPDB studio/site
|
||||
type StudioResponse struct {
|
||||
ID int `json:"id"`
|
||||
Name string `json:"name"`
|
||||
Slug string `json:"slug"`
|
||||
URL *string `json:"url"`
|
||||
Description *string `json:"description"`
|
||||
Logo *string `json:"logo"`
|
||||
Parent *ParentStudio `json:"parent"`
|
||||
}
|
||||
|
||||
// ParentStudio represents a parent studio/network
|
||||
type ParentStudio struct {
|
||||
ID int `json:"id"`
|
||||
Name string `json:"name"`
|
||||
Slug string `json:"slug"`
|
||||
}
|
||||
|
||||
// SceneResponse represents a TPDB scene
|
||||
type SceneResponse struct {
|
||||
ID string `json:"id"`
|
||||
UUID string `json:"uuid"`
|
||||
Title string `json:"title"`
|
||||
Description *string `json:"description"`
|
||||
URL *string `json:"url"`
|
||||
Date *string `json:"date"`
|
||||
Image *string `json:"image"`
|
||||
Poster *string `json:"poster"`
|
||||
Duration *int `json:"duration"`
|
||||
Director *string `json:"director"`
|
||||
Code *string `json:"code"`
|
||||
Site *SiteInfo `json:"site"`
|
||||
Performers []PerformerInfo `json:"performers"`
|
||||
Tags []TagInfo `json:"tags"`
|
||||
}
|
||||
|
||||
// SiteInfo represents basic site information in a scene response
|
||||
type SiteInfo struct {
|
||||
ID int `json:"id"`
|
||||
Name string `json:"name"`
|
||||
URL *string `json:"url"`
|
||||
}
|
||||
|
||||
// PerformerInfo represents basic performer information in a scene response
|
||||
type PerformerInfo struct {
|
||||
ID string `json:"id"`
|
||||
Name string `json:"name"`
|
||||
Gender *string `json:"gender"`
|
||||
}
|
||||
|
||||
// TagInfo represents tag information in a scene response
|
||||
type TagInfo struct {
|
||||
ID string `json:"id"`
|
||||
Name string `json:"name"`
|
||||
Slug string `json:"slug"`
|
||||
}
|
||||
Loading…
Reference in New Issue
Block a user