
bitmagnet-io / bitmagnet


A self-hosted BitTorrent indexer, DHT crawler, content classifier and torrent search engine with web UI, GraphQL API and Servarr stack integration.

Home Page: https://bitmagnet.io/

License: MIT License

Dockerfile 0.06% Ruby 0.07% HTML 3.88% JavaScript 0.01% SCSS 0.75% Go 85.26% TypeScript 9.67% PLpgSQL 0.31%
bittorrent dht prowlarr radarr selfhosted servarr sonarr torrent torrents torznab

bitmagnet's People

Contributors

chkuendig, deining, dependabot[bot], dr4gn3l, josteink, mgdigital, myyc, niklasbr, ornias1993, poruta99, realfascinated, sweetbbak, technetium1


bitmagnet's Issues

Enter button not recognised in search input on iOS

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Describe the bug
Unable to trigger search query.

To Reproduce

  1. using a mobile iOS device
  2. load the web ui
  3. enter a search term in the search box
  4. press enter

Expected behavior
Torrent content should populate as on other devices

General (please complete the following information):

  • Bitmagnet version: v0.4.1
  • OS and version: iPadOS 17.2
  • Browser and version (if issue is with WebUI): Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.2 Safari/605.1.15

Additional context
Same behavior on iPhone.

error persisting torrents: extended protocol limited to 65535 parameters

Please do not open issues asking for general support with setting up the app and configuring Docker, as such requests are too time consuming to handle. All the information you need to get up and running is in the Bitmagnet docs, the Docker docs and on Google. If you'd like to suggest improvements or corrections to the docs then please open a PR. If you think you've found a bug then please help us out by letting us know. ❤️

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Describe the bug
During normal crawling, the error "error persisting torrents: extended protocol limited to 65535 parameters" appears every few minutes, and the affected torrents won't show up when searched in the web UI.

To Reproduce
Steps to reproduce the behavior:

  1. run the project in docker (docker latest or git HEAD) both produce this error.

Expected behavior
Persist torrents.

General (please complete the following information):

  • Bitmagnet version: git HEAD
  • OS and version: Alpine Linux v3.19
  • Browser and version (if issue is with WebUI): N/A

Additional context

bitmagnet           | ERROR     gorm    gorm/logger.go:72       gorm trace      {"location": "/build/internal/database/dao/torrents.gen.go:691", "error": "extended protocol limited to 65535 parameters", "elapsed": 194.518714, "sql": "INSERT INTO \"torrents\" (\"info_hash\",\"name\",\"size\",\"private\",\"piece_length\",\"pieces\",\"created_at\",\"updated_at\",\"files_status\") VALUES ('<binary>','NAME',2242728,false,NULL,'','2024-01-25 20:57:20.868','2024-01-25 20:57:20.868','multi'), *** more stuff like before *** ON CONFLICT (\"info_hash\") DO UPDATE SET \"name\"=\"excluded\".\"name\",\"files_status\"=\"excluded\".\"files_status\",\"piece_length\"=\"excluded\".\"piece_length\",\"pieces\"=\"excluded\".\"pieces\"", "rows": 100}
bitmagnet           | github.com/bitmagnet-io/bitmagnet/internal/database/gorm.(*customLogger).Trace
bitmagnet           |   /build/internal/database/gorm/logger.go:72
bitmagnet           | gorm.io/gorm.(*processor).Execute
bitmagnet           |   /go/pkg/mod/gorm.io/[email protected]/callbacks.go:134
bitmagnet           | gorm.io/gorm.(*DB).CreateInBatches.func1
bitmagnet           |   /go/pkg/mod/gorm.io/[email protected]/finisher_api.go:48
bitmagnet           | gorm.io/gorm.(*DB).Transaction
bitmagnet           |   /go/pkg/mod/gorm.io/[email protected]/finisher_api.go:647
bitmagnet           | gorm.io/gorm.(*DB).CreateInBatches
bitmagnet           |   /go/pkg/mod/gorm.io/[email protected]/finisher_api.go:60
bitmagnet           | gorm.io/gen.(*DO).CreateInBatches
bitmagnet           |   /go/pkg/mod/gorm.io/[email protected]/do.go:598
bitmagnet           | github.com/bitmagnet-io/bitmagnet/internal/database/dao.torrentDo.CreateInBatches
bitmagnet           |   /build/internal/database/dao/torrents.gen.go:691
bitmagnet           | github.com/bitmagnet-io/bitmagnet/internal/dhtcrawler.(*crawler).runPersistTorrents
bitmagnet           |   /build/internal/dhtcrawler/persist.go:43
bitmagnet           | ERROR     dht_crawler     dhtcrawler/persist.go:44        error persisting torrents: extended protocol limited to 65535 parameters
bitmagnet           | github.com/bitmagnet-io/bitmagnet/internal/dhtcrawler.(*crawler).runPersistTorrents
bitmagnet           |   /build/internal/dhtcrawler/persist.go:44
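
For context, Postgres's extended query protocol caps a single statement at 65,535 bind parameters, so the number of rows per batched INSERT has to stay under 65535 divided by the bound columns per row. A minimal sketch of picking a safe batch size for a CreateInBatches-style call (hypothetical helper, not bitmagnet's actual code):

package main

import "fmt"

// maxBindParams is Postgres's limit on bind parameters per statement
// when using the extended query protocol.
const maxBindParams = 65535

// safeBatchSize returns the largest batch size that keeps a multi-row
// INSERT under the parameter limit, given the number of bound columns
// per row (hypothetical helper, not bitmagnet's actual code).
func safeBatchSize(columnsPerRow int) int {
    if columnsPerRow <= 0 {
        return 1
    }
    n := maxBindParams / columnsPerRow
    if n < 1 {
        n = 1
    }
    return n
}

func main() {
    // e.g. the torrents INSERT in the log above binds 9 columns per row.
    fmt.Println(safeBatchSize(9)) // 7281
    // A caller could then do: db.CreateInBatches(rows, safeBatchSize(9))
}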

Bootstrap nodes are hardcoded via anacrolix/dht library defaults

It appears that there is currently no way to configure bootstrap nodes; which bootstrap nodes are used is decided by the anacrolix/dht library: https://github.com/bitmagnet-io/bitmagnet/blob/main/internal/dht/crawler/crawl_bootstrap_hosts.go#L14

These would be the nodes used:
https://github.com/anacrolix/dht/blob/99d8ef71342b650e2cd25a0d81be8986a2e69221/dht.go#L106-L115

Being able to configure bootstrap nodes would let users extend the list, and would also allow them to avoid bootstrapping from certain nodes they don't want to contact.
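
A sketch of what a configurable list might look like, assuming a hypothetical DHT_BOOTSTRAP_NODES environment variable (not an existing bitmagnet option) holding comma-separated host:port pairs that are resolved and then handed to the DHT library in place of its defaults:

package main

import (
    "fmt"
    "net"
    "os"
    "strings"
)

// bootstrapAddrs parses the hypothetical DHT_BOOTSTRAP_NODES variable into
// UDP addresses; an empty value means "fall back to the library defaults".
func bootstrapAddrs() ([]*net.UDPAddr, error) {
    raw := os.Getenv("DHT_BOOTSTRAP_NODES")
    if raw == "" {
        return nil, nil
    }
    var addrs []*net.UDPAddr
    for _, hostPort := range strings.Split(raw, ",") {
        addr, err := net.ResolveUDPAddr("udp", strings.TrimSpace(hostPort))
        if err != nil {
            return nil, fmt.Errorf("invalid bootstrap node %q: %w", hostPort, err)
        }
        addrs = append(addrs, addr)
    }
    return addrs, nil
}

func main() {
    addrs, err := bootstrapAddrs()
    if err != nil {
        panic(err)
    }
    fmt.Println(addrs)
}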

Ordering of search results in GraphQL API and UI

Gathering this in a single issue although I don't expect it will be implemented in a single PR.

A good starter would be:

  • size
  • torrent created at
  • name
  • seeders, leechers (a bit more complicated; needs to take a max from the torrents_torrent_sources table)

Will need familiarity with the existing search implementation; currently the domain-specific search code lives here: https://github.com/bitmagnet-io/bitmagnet/tree/main/internal/database/search

So I imagine these files would be called things like:

  • internal/database/search/order_torrent_created_at.go
    etc...

There are a couple of simple examples of ordering here: https://github.com/bitmagnet-io/bitmagnet/blob/main/internal/database/query/options.go#L117

I'd envisage a search parameter with enums that could be passed as an array, e.g. order: [size_desc, created_desc, name_asc].

For the UI a simple dropdown next to the search input would probably suffice for now, with links on the relevant table headers.
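
To make the shape of this concrete, a rough sketch of mapping such order enums onto gorm ORDER BY clauses (names are illustrative, not the actual bitmagnet search API):

package search

import "gorm.io/gorm"

type OrderBy string

const (
    OrderSizeDesc    OrderBy = "size_desc"
    OrderCreatedDesc OrderBy = "created_desc"
    OrderNameAsc     OrderBy = "name_asc"
)

// orderClauses maps each enum value to a SQL ordering clause (column names
// are illustrative).
var orderClauses = map[OrderBy]string{
    OrderSizeDesc:    "torrents.size DESC",
    OrderCreatedDesc: "torrents.created_at DESC",
    OrderNameAsc:     "torrents.name ASC",
}

// ApplyOrder appends the requested orderings, in the order given, to a query.
func ApplyOrder(q *gorm.DB, orders []OrderBy) *gorm.DB {
    for _, o := range orders {
        if sqlClause, ok := orderClauses[o]; ok {
            q = q.Order(sqlClause)
        }
    }
    return q
}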

Redis DB configuration is not respected

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Describe the bug
It looks like, regardless of the redis.db setting, values are written to db 0.

To Reproduce
Steps to reproduce the behavior:

  • Set the REDIS_DB env variable to a value different from 0
  • Start bitmagnet
  • From the container, verify that bitmagnet config show displays the env value for the redis.db setting
  • Listing the keys for the configured DB returns no keys
  • Listing keys from DB 0 shows some bitmagnet keys

Expected behavior
Redis keys are written to the specified DB only.

General (please complete the following information):

  • Bitmagnet version: 0.0.7
  • OS and version: Arch linux - Docker
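
For reference, go-redis selects the logical database via the DB field of redis.Options, and the asynq queue library opens its own connection with a separate DB field on asynq.RedisClientOpt, so a fix would likely need the setting wired through both paths. A minimal go-redis sketch (cfg here is a stand-in, not bitmagnet's actual config struct):

package main

import (
    "context"
    "fmt"

    "github.com/redis/go-redis/v9"
)

func main() {
    cfg := struct {
        Addr string
        DB   int
    }{Addr: "localhost:6379", DB: 3}

    client := redis.NewClient(&redis.Options{
        Addr: cfg.Addr,
        DB:   cfg.DB, // if this is left at 0, all keys land in db 0
    })
    fmt.Println(client.Ping(context.Background()).Err())
}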

Inquiry About Directing Multiple DHT Crawlers to a Single Database

I am considering directing multiple DHT crawlers towards a single database. My primary concern is whether this setup might result in identical data entries being repeatedly written to the database. Additionally, I am curious if there might be other potential issues that could arise from such a configuration.

[Feature Request] crawl through the onion network to prevent abuse reports

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Is your feature request related to a problem? Please describe.
If you run this program you will start to show up in iknowwhatyoudownload.com and could potentially start receiving abuse reports.

Describe the solution you'd like
Let us route everything through Tor if we want to, disabled by default. Make sure DNS is proxied to prevent rDNS leakage. Either use a docker-compose Tor container or a system-level Tor service and have this container route through it via the SOCKS5 proxy on localhost, or just give us an environment variable that is respected, like
SOCKS_PROXY=127.0.0.1:9050

services:
  bitmagnet:
    build:
      context: .
    container_name: bitmagnet
    #     image: ghcr.io/bitmagnet-io/bitmagnet:latest
    volumes:
      - ./data/bitmagnet:/tmp
    #ports:
      #- "3333:3333"
      # BitTorrent ports, wont work
      #- "3334:3334/tcp"
      #- "3334:3334/udp"
    restart: unless-stopped
    environment:
      - LOG_LEVEL=debug
      - POSTGRES_HOST=postgres
      - POSTGRES_PASSWORD=postgres
      - REDIS_ADDR=redis:6379
      - SOCKS_PROXY=127.0.0.1:9050
    command:
      - worker
      - run
      - --keys=http_server
      - --keys=queue_server
      - --keys=dht_crawler
    depends_on:
      postgres:
        condition: service_healthy
      redis:
        condition: service_healthy

Describe alternatives you've considered
I figured out how to route this over a VPN but not over Tor. What I did was:

  1. create a WireGuard container with Mullvad
  2. set this container's network mode to "container", pointing at the WireGuard/Mullvad container.
    However, you cannot expose ports this way.

Additional context
I'm not sure open ports are needed, other than for performance, as the other end would need an open port to serve metainfo to us. If I need to run this for a few months, that's fine.

tl;dr: I want to run the crawler through Tor.
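
As a hedged sketch of the SOCKS_PROXY idea (hypothetical variable, not an existing bitmagnet option): outgoing TCP connections such as metainfo requests could be dialled through golang.org/x/net/proxy, though note that Tor only carries TCP, so the UDP-based DHT traffic couldn't be routed this way without extra work.

package main

import (
    "net"
    "os"

    "golang.org/x/net/proxy"
)

// dialer returns a TCP dialer that goes via the SOCKS5 proxy named in
// SOCKS_PROXY (hypothetical env var), or a direct dialer if unset.
func dialer() (proxy.Dialer, error) {
    addr := os.Getenv("SOCKS_PROXY")
    if addr == "" {
        return &net.Dialer{}, nil
    }
    return proxy.SOCKS5("tcp", addr, nil, proxy.Direct)
}

func main() {
    d, err := dialer()
    if err != nil {
        panic(err)
    }
    conn, err := d.Dial("tcp", "example.com:80")
    if err != nil {
        panic(err)
    }
    conn.Close()
}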

webui with an IP other than localhost

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Is your feature request related to a problem? Please describe.
I want to run the Docker container 24/7 on a headless, LAN-connected client, and then use HTTP to access the web UI from my laptop when I need it.

Describe the solution you'd like
Allow assigning a non-localhost IP.

Describe alternatives you've considered
Using proxies (like stunnel) to redirect it; this works but it is dirty.

Update:
After using stunnel I see that the web UI listens on tcp4 0.0.0.0:3333, which works for me, but it would be better to be able to set the web UI IP separately (like 127.0.0.1, a LAN IP and so on).

Rule-based deletion

The crawler picks up many torrents, but it would be a great addition to the project if we could pass a flag to only store data or torrents that are categorized, and not store unknown ones.

permissions of postgres data folder

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Is your feature request related to a problem? Please describe.
I want to be able to run this as a regular user and access the database (I have added my user to the docker group).
When I run this with Docker, the Postgres data folder gets its ownership set to user ID 70 and group root.
I can change the owner and mode of the data folder, but at the next run the Postgres folder gets user ID 70 again and its permissions are reset to owner-only (drwx------).

Describe the solution you'd like
I want to be able to access the DB it creates; a better way would be for the database to be created owned by my user.

Describe alternatives you've considered
Changing the permissions after every run.
Not really elegant.


Removing aggregations from filtered results to improve UI responsiveness

Describe the bug
Aggregations (that's to say, the total counts and faceted counts on the filters) are killing performance in the web UI when the number of indexed torrents grows large. They're a nice convenience, but I don't think it's feasible to keep them. This will involve some quite big UX changes: we'll probably keep a warm cache for the overall totals, but you'll lose the counts on any filtered results. We'll also need to remove the total count from Torznab feeds (edit: maybe later). The filtered aggregations will still be available in the GraphQL API.

To Reproduce
Index a few million torrents and see that filtering and search in the web UI and Torznab API is slow.

Question about import

Hi, and thanks for your tool.
At the moment I have a huge magnetico DB; is it possible to import it correctly into bitmagnet please?
It would help to keep all the archived magnets, thanks.
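
Not an official answer, but a rough sketch of what a manual migration could look like, streaming rows into the /import endpoint as newline-delimited JSON (the same format as the rarbg example in the next issue). It assumes magnetico's torrents table has info_hash, name, total_size and discovered_on columns, and that "magnetico" is an acceptable source value; both assumptions should be checked first.

package main

import (
    "database/sql"
    "encoding/hex"
    "encoding/json"
    "io"
    "net/http"
    "time"

    _ "modernc.org/sqlite"
)

type importItem struct {
    InfoHash    string `json:"infoHash"`
    Name        string `json:"name"`
    Size        int64  `json:"size"`
    PublishedAt string `json:"publishedAt"`
    Source      string `json:"source"`
}

func main() {
    db, err := sql.Open("sqlite", "magnetico.db")
    if err != nil {
        panic(err)
    }
    defer db.Close()

    pr, pw := io.Pipe()
    go func() {
        defer pw.Close()
        // Assumed magnetico schema; discovered_on is assumed to be unix seconds.
        rows, err := db.Query(`SELECT info_hash, name, total_size, discovered_on FROM torrents`)
        if err != nil {
            pw.CloseWithError(err)
            return
        }
        defer rows.Close()
        enc := json.NewEncoder(pw) // writes one JSON object per line
        for rows.Next() {
            var hash []byte
            var discoveredOn int64
            var item importItem
            if err := rows.Scan(&hash, &item.Name, &item.Size, &discoveredOn); err != nil {
                pw.CloseWithError(err)
                return
            }
            item.InfoHash = hex.EncodeToString(hash)
            item.PublishedAt = time.Unix(discoveredOn, 0).UTC().Format(time.RFC3339)
            item.Source = "magnetico"
            if err := enc.Encode(item); err != nil {
                pw.CloseWithError(err)
                return
            }
        }
    }()

    resp, err := http.Post("http://localhost:3333/import", "application/json", pr)
    if err != nil {
        panic(err)
    }
    resp.Body.Close()
}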

manual import of hashes fails

Thanks for creating this awesome tool.
I've tried to run your example script to import the BitTorrent hashes from rarbg, but curl always fails with the following output:

$ sqlite3 -json -batch rarbg_db.sqlite "$(cat rarbg-import.sql)"   | jq -r --indent 0 '.[] | . * { source: "rarbg" } | . + if .imdb != null then { externalIds: { imdb: .imdb } } else {} end | del(.imdb) | del(..|nulls)'   | curl --verbose -H "Content-Type: application/json" --data-binary @- http://localhost:3333/import
*   Trying 127.0.0.1:3333...
* Connected to localhost (127.0.0.1) port 3333 (#0)
> POST /import HTTP/1.1
> Host: localhost:3333
> User-Agent: curl/7.81.0
> Accept: */*
> Content-Type: application/json
> Content-Length: 269567440
> Expect: 100-continue
>
* Mark bundle as not supporting multiuse
< HTTP/1.1 100 Continue
* Send failure: Broken pipe
* Closing connection 0
curl: (55) Send failure: Broken pipe

bitmagnet shows the following logs, before restarting:

github.com/bitmagnet-io/bitmagnet/internal/importer.(*activeImport).flushLocked(0xc001923c80)
	/build/internal/importer/importer.go:187 +0x37
github.com/bitmagnet-io/bitmagnet/internal/importer.(*activeImport).buffer(_, {{0xc0014fe800, 0x5}, {0x0, 0x5, 0x7a, 0xef, 0x5b, 0x56, 0x3b, ...}, ...})
	/build/internal/importer/importer.go:173 +0x185
created by github.com/bitmagnet-io/bitmagnet/internal/importer.(*activeImport).run.func1 in goroutine 23879
	/build/internal/importer/importer.go:158 +0x245

The JSON data (with the hashes removed) generated before it gets piped into curl seems reasonable:

{"infoHash":"0000000000000000000000000000000000000000","name":"Daybreak.1993.1080p.AMZN.WEBRip.DDP2.0.x264-BTW","size":5681184768,"contentType":"movie","publishedAt":"2019-10-11T16:32:58.000Z","source":"rarbg","externalIds":{"imdb":"tt0106676"}}
{"infoHash":"0000000000000000000000000000000000000000","name":"Dances.with.Wolves.1990.DC.20th.Anniversary.Edition.2.Discs.1080p.BluRay.AVC.DTS-HD.MA.7.1-FGT","size":62604181504,"contentType":"movie","videoSource":"BluRay","videoModifier":"BRDISK","publishedAt":"2015-05-24T00:35:37.000Z","source":"rarbg"}
{"infoHash":"0000000000000000000000000000000000000000","name":"American.Dad.S19E22.The.Grounch.1080p.DSNP.WEBRip.DDP5.1.x264-NTb[rartv]","size":610271232,"contentType":"tv_show","publishedAt":"2022-12-27T16:33:11.000Z","source":"rarbg","externalIds":{"imdb":"tt0397306"}}
{"infoHash":"0000000000000000000000000000000000000000","name":"Baldurs.Gate.3.v58649-GOG","size":97609842688,"contentType":"game","publishedAt":"2022-11-06T17:46:49.000Z","source":"rarbg"}
{"infoHash":"0000000000000000000000000000000000000000","name":"Operation.Mekong.2016.CHINESE.BRRip.XviD.MP3-VXT","size":1669332992,"contentType":"movie","videoCodec":"XviD","publishedAt":"2019-09-15T01:29:23.000Z","source":"rarbg","externalIds":{"imdb":"tt6044910"}}

Manually sending a request with a single JSON object returns a success message, but the data doesn't seem to get added (it's unavailable in the web dashboard and there are no logs showing that anything is happening):

$ head -n1 rarbg_db.json | curl --verbose -H "Content-Type: application/json" --data-binary @- http://localhost:3333/import
*   Trying 127.0.0.1:3333...
* Connected to localhost (127.0.0.1) port 3333 (#0)
> POST /import HTTP/1.1
> Host: localhost:3333
> User-Agent: curl/7.81.0
> Accept: */*
> Content-Type: application/json
> Content-Length: 198
>
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< Vary: Origin
< Date: Thu, 05 Oct 2023 14:09:58 GMT
< Content-Length: 17
< Content-Type: text/plain; charset=utf-8
<
1 items imported
* Connection #0 to host localhost left intact

Content type detection based on torrent names and filenames

Is your feature request related to a problem? Please describe.
Content type detection is currently implemented only for movies, TV shows and some XXX content known by TMDB.

Describe the solution you'd like
We could make some good guesses about content type based on keywords in the torrent name or included files, for example:

  • If a torrent consists of MP3/FLAC files or includes keywords such as "discography" then it's probably music
  • If it includes epub/mobi files then it's probably an ebook
  • If it includes m4b files then it's probably an audiobook
  • If it includes keywords such as "Windows"/"MacOS" then it's probably software
  • If it includes the keyword "XXX" then it's probably a porno
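
A minimal sketch of the kind of heuristic described above (illustrative only, not bitmagnet's actual classifier), guessing a content type from keywords in the torrent name and file extensions:

package main

import (
    "fmt"
    "path/filepath"
    "strings"
)

// extToType maps file extensions to a guessed content type.
var extToType = map[string]string{
    ".mp3": "music", ".flac": "music",
    ".epub": "ebook", ".mobi": "ebook",
    ".m4b": "audiobook",
}

// guessContentType applies simple keyword and extension heuristics; the
// category names here are illustrative.
func guessContentType(name string, files []string) string {
    lower := strings.ToLower(name)
    switch {
    case strings.Contains(lower, "discography"):
        return "music"
    case strings.Contains(lower, "windows"), strings.Contains(lower, "macos"):
        return "software"
    case strings.Contains(lower, "xxx"):
        return "xxx"
    }
    counts := map[string]int{}
    for _, f := range files {
        if t, ok := extToType[strings.ToLower(filepath.Ext(f))]; ok {
            counts[t]++
        }
    }
    best, bestN := "unknown", 0
    for t, n := range counts {
        if n > bestN {
            best, bestN = t, n
        }
    }
    return best
}

func main() {
    fmt.Println(guessContentType("Some Artist Discography 1990-2020", nil)) // music
    fmt.Println(guessContentType("Some Novel", []string{"book.epub"}))      // ebook
}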

Crash: slice bounds out of range

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Describe the bug
bitmagnet crashes with a panic:


panic: runtime error: slice bounds out of range [:49152] with capacity 38184

goroutine 5492610 [running]:
github.com/bitmagnet-io/bitmagnet/internal/metainfo/metainforequester.readAllPieces({0x935b360, 0xc000774590}, 0x9528)
        /mnt/podman-volume/bitmagnet/bitmagnet-io/internal/metainfo/metainforequester/requester.go:262 +0x445
github.com/bitmagnet-io/bitmagnet/internal/metainfo/metainforequester.requester.Request.func1()
        /mnt/podman-volume/bitmagnet/bitmagnet-io/internal/metainfo/metainforequester/requester.go:89 +0x1ae
created by github.com/bitmagnet-io/bitmagnet/internal/metainfo/metainforequester.requester.Request in goroutine 5492132
        /mnt/podman-volume/bitmagnet/bitmagnet-io/internal/metainfo/metainforequester/requester.go:67 +0x2e7

To Reproduce
Just wait for a torrent with a lot of pieces, I guess.

Expected behavior
No crash.

General (please complete the following information):

  • Bitmagnet version: v0.0.7
  • OS and version: Ubuntu
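
The panic is a plain out-of-range slice: the declared size (49152) exceeds the bytes actually buffered (38184). A hedged sketch of the defensive check (illustrative only, not the actual metainfo requester code):

package main

import "fmt"

// readPiece returns buf[offset:offset+length], or an error instead of
// panicking when the requested range exceeds what was actually buffered.
func readPiece(buf []byte, offset, length int) ([]byte, error) {
    if offset < 0 || length < 0 || offset+length > len(buf) {
        return nil, fmt.Errorf("piece [%d:%d] out of range for %d buffered bytes",
            offset, offset+length, len(buf))
    }
    return buf[offset : offset+length], nil
}

func main() {
    _, err := readPiece(make([]byte, 38184), 0, 49152)
    fmt.Println(err) // would have panicked with a plain slice expression
}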

Cannot set Postgres port from environment variable

Setting POSTGRES_PORT to any value crashes. This is the full output of the command, but I think only the very last line is relevant. Leaving this environment value at the default seems to work as intended, but my environment requires an alternate port.

command to produce error:
podman run -it --rm -e POSTGRES_PORT=6463 ghcr.io/bitmagnet-io/bitmagnet:latest

[Fx] PROVIDE *cli.Command[group = "commands"] <= github.com/bitmagnet-io/bitmagnet/internal/app/cmd/searchcmd.New() from module "app" [Fx] PROVIDE *cli.Command[group = "commands"] <= github.com/bitmagnet-io/bitmagnet/internal/app/cmd/torrentcmd.New() from module "app" [Fx] PROVIDE httpserver.Option[group = "http_server_options"] <= github.com/bitmagnet-io/bitmagnet/internal/webui.New() from module "app" [Fx] PROVIDE hooks.AttachedHooks <= github.com/bitmagnet-io/bitmagnet/internal/boilerplate/cli/hooks.New() from module "app_boilerplate" [Fx] PROVIDE *cli.Command[group = "commands"] <= github.com/bitmagnet-io/bitmagnet/internal/boilerplate/app/cmd/config.New() from module "app_boilerplate" [Fx] PROVIDE *cli.Command[group = "commands"] <= github.com/bitmagnet-io/bitmagnet/internal/boilerplate/app/cmd/worker.New() from module "app_boilerplate" [Fx] PROVIDE []string[name = "cli_args"] <= github.com/bitmagnet-io/bitmagnet/internal/boilerplate/cli/args.New() from module "cli" [Fx] PROVIDE *cli.App <= github.com/bitmagnet-io/bitmagnet/internal/boilerplate/cli.New() from module "cli" [Fx] PROVIDE config.ResolvedConfig <= github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config.New() from module "config" [Fx] PROVIDE configresolver.Resolver[group = "config_resolvers"] <= fx.Annotated{Group: "config_resolvers", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.New.func1()} from module "config" [Fx] PROVIDE configresolver.Resolver[group = "config_resolvers"] <= fx.Annotated{Group: "config_resolvers", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.New.func2()} from module "config" [Fx] PROVIDE configresolver.Resolver[group = "config_resolvers"] <= fx.Annotated{Group: "config_resolvers", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.New.func3()} from module "config" [Fx] PROVIDE *health.Health <= github.com/bitmagnet-io/bitmagnet/internal/boilerplate/healthcheck.New() from module "healthcheck" [Fx] PROVIDE httpserver.Option[group = "http_server_options"] <= github.com/bitmagnet-io/bitmagnet/internal/boilerplate/healthcheck/httpserver.New() from module "healthcheck" [Fx] PROVIDE zap.Config <= github.com/bitmagnet-io/bitmagnet/internal/boilerplate/logging.NewZapConfig() from module "logging" [Fx] PROVIDE *zap.Logger <= github.com/bitmagnet-io/bitmagnet/internal/boilerplate/logging.New() from module "logging" [Fx] PROVIDE *zap.SugaredLogger <= github.com/bitmagnet-io/bitmagnet/internal/boilerplate/logging.New() from module "logging" [Fx] PROVIDE config.Spec[group = "config_specs"] <= fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:log" [Fx] PROVIDE logging.Config <= fx.Annotated{Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func2()} from module "config:log" [Fx] PROVIDE *validator.Validate <= github.com/bitmagnet-io/bitmagnet/internal/boilerplate/validation.New() from module "validation" [Fx] PROVIDE worker.Registry <= github.com/bitmagnet-io/bitmagnet/internal/boilerplate/worker.NewRegistry() from module "worker" [Fx] PROVIDE classifier.Classifier <= github.com/bitmagnet-io/bitmagnet/internal/classifier.New() from module "classifier" [Fx] PROVIDE consumer.Consumer[group = "queue_consumers"] <= github.com/bitmagnet-io/bitmagnet/internal/classifier/asynq/consumer.New() from module "classifier" [Fx] PROVIDE 
producer.Producer[github.com/bitmagnet-io/bitmagnet/internal/classifier/asynq/message.ClassifyTorrentPayload] <= github.com/bitmagnet-io/bitmagnet/internal/classifier/asynq/producer.New() from module "classifier" [Fx] PROVIDE publisher.Publisher[github.com/bitmagnet-io/bitmagnet/internal/classifier/asynq/message.ClassifyTorrentPayload] <= github.com/bitmagnet-io/bitmagnet/internal/classifier/asynq/publisher.New() from module "classifier" [Fx] PROVIDE resolver.SubResolver[group = "content_resolvers"] <= github.com/bitmagnet-io/bitmagnet/internal/classifier/resolver/video.New() from module "classifier" [Fx] PROVIDE *tmdb.Client <= github.com/bitmagnet-io/bitmagnet/internal/classifier/video/tmdb.New() from module "movie" [Fx] PROVIDE tmdb.Client <= github.com/bitmagnet-io/bitmagnet/internal/classifier/video/tmdb.New() from module "movie" [Fx] PROVIDE resolver.RootResolver <= github.com/bitmagnet-io/bitmagnet/internal/classifier/resolver.New() from module "movie" [Fx] PROVIDE config.Spec[group = "config_specs"] <= fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:tmdb" [Fx] PROVIDE tmdb.Config <= fx.Annotated{Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func2()} from module "config:tmdb" [Fx] PROVIDE server.Server <= github.com/bitmagnet-io/bitmagnet/internal/dht/server.New() from module "dht" [Fx] PROVIDE fx.Hook[group = "app_hooks"] <= github.com/bitmagnet-io/bitmagnet/internal/dht/server.New() from module "dht" [Fx] PROVIDE crawler.Crawler <= github.com/bitmagnet-io/bitmagnet/internal/dht/crawler.New() from module "dht" [Fx] PROVIDE worker.Worker[group = "workers"] <= github.com/bitmagnet-io/bitmagnet/internal/dht/crawler.New() from module "dht" [Fx] PROVIDE health.Option[group = "healthcheck_options"] <= github.com/bitmagnet-io/bitmagnet/internal/dht/healthcheck.New() from module "dht" [Fx] PROVIDE staging.Staging <= github.com/bitmagnet-io/bitmagnet/internal/dht/staging.New() from module "dht" [Fx] PROVIDE config.Spec[group = "config_specs"] <= fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:dht_crawler" [Fx] PROVIDE dht.Config <= fx.Annotated{Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func2()} from module "config:dht_crawler" [Fx] PROVIDE config.Spec[group = "config_specs"] <= fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:dht_server" [Fx] PROVIDE server.Config <= fx.Annotated{Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func2()} from module "config:dht_server" [Fx] PROVIDE caches.Cacher <= github.com/bitmagnet-io/bitmagnet/internal/database/cache.NewInMemoryCacher() from module "database" [Fx] PROVIDE *caches.Caches <= github.com/bitmagnet-io/bitmagnet/internal/database/cache.NewPlugin() from module "database" [Fx] PROVIDE *dao.Query <= github.com/bitmagnet-io/bitmagnet/internal/database/dao.New() from module "database" [Fx] PROVIDE *sql.DB <= github.com/bitmagnet-io/bitmagnet/internal/database.New() from module "database" [Fx] PROVIDE *gorm.DB <= github.com/bitmagnet-io/bitmagnet/internal/database.New() from module "database" [Fx] PROVIDE health.Option[group = 
"healthcheck_options"] <= github.com/bitmagnet-io/bitmagnet/internal/database/healthcheck.New() from module "database" [Fx] PROVIDE persistence.Persistence <= github.com/bitmagnet-io/bitmagnet/internal/database/persistence.New() from module "database" [Fx] PROVIDE gorm.Dialector <= github.com/bitmagnet-io/bitmagnet/internal/database/postgres.New() from module "database" [Fx] PROVIDE search.Search <= github.com/bitmagnet-io/bitmagnet/internal/database/search.New() from module "database" [Fx] PROVIDE config.Spec[group = "config_specs"] <= fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:postgres" [Fx] PROVIDE postgres.Config <= fx.Annotated{Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func2()} from module "config:postgres" [Fx] PROVIDE config.Spec[group = "config_specs"] <= fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:gorm_cache" [Fx] PROVIDE cache.Config <= fx.Annotated{Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func2()} from module "config:gorm_cache" [Fx] PROVIDE gql.Config <= github.com/bitmagnet-io/bitmagnet/internal/gql/config.New() from module "graphql" [Fx] PROVIDE graphql.ExecutableSchema <= github.com/bitmagnet-io/bitmagnet/internal/gql.NewExecutableSchema() from module "graphql" [Fx] PROVIDE gql.ResolverRoot <= github.com/bitmagnet-io/bitmagnet/internal/gql/resolvers.New() from module "graphql" [Fx] PROVIDE httpserver.Option[group = "http_server_options"] <= github.com/bitmagnet-io/bitmagnet/internal/gql/httpserver.New() from module "graphql" [Fx] PROVIDE *gin.Engine <= github.com/bitmagnet-io/bitmagnet/internal/boilerplate/httpserver.New() from module "http_server" [Fx] PROVIDE *http.Server <= github.com/bitmagnet-io/bitmagnet/internal/boilerplate/httpserver.New() from module "http_server" [Fx] PROVIDE worker.Worker[group = "workers"] <= github.com/bitmagnet-io/bitmagnet/internal/boilerplate/httpserver.New() from module "http_server" [Fx] PROVIDE httpserver.Option[group = "http_server_options"] <= github.com/bitmagnet-io/bitmagnet/internal/boilerplate/httpserver/cors.New() from module "http_server" [Fx] PROVIDE config.Spec[group = "config_specs"] <= fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:http_server" [Fx] PROVIDE httpserver.Config <= fx.Annotated{Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func2()} from module "config:http_server" [Fx] PROVIDE httpserver.Option[group = "http_server_options"] <= github.com/bitmagnet-io/bitmagnet/internal/importer/httpserver.New() from module "importer" [Fx] PROVIDE importer.Importer <= github.com/bitmagnet-io/bitmagnet/internal/importer.New() from module "importer" [Fx] PROVIDE metainforequester.Requester <= github.com/bitmagnet-io/bitmagnet/internal/metainfo/metainforequester.New() from module "metainfo" [Fx] PROVIDE config.Spec[group = "config_specs"] <= fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:metainfo_requester" [Fx] PROVIDE metainforequester.Config <= fx.Annotated{Target: 
github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func2()} from module "config:metainfo_requester" [Fx] PROVIDE *asynq.Client <= github.com/bitmagnet-io/bitmagnet/internal/queue/client.New() from module "subscriber" [Fx] PROVIDE *asynq.Server <= github.com/bitmagnet-io/bitmagnet/internal/queue/server.New() from module "subscriber" [Fx] PROVIDE *asynq.ServeMux <= github.com/bitmagnet-io/bitmagnet/internal/queue/server.New() from module "subscriber" [Fx] PROVIDE worker.Worker[group = "workers"] <= github.com/bitmagnet-io/bitmagnet/internal/queue/server.New() from module "subscriber" [Fx] PROVIDE config.Spec[group = "config_specs"] <= fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:queue" [Fx] PROVIDE queue.Config <= fx.Annotated{Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func2()} from module "config:queue" [Fx] PROVIDE *redis.Client <= github.com/bitmagnet-io/bitmagnet/internal/redis/redisfx.New.func1() from module "redis" [Fx] PROVIDE health.Option[group = "healthcheck_options"] <= github.com/bitmagnet-io/bitmagnet/internal/redis/healthcheck.New() from module "redis" [Fx] PROVIDE config.Spec[group = "config_specs"] <= fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:redis" [Fx] PROVIDE redisconfig.Config <= fx.Annotated{Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func2()} from module "config:redis" [Fx] PROVIDE torznab.Client <= github.com/bitmagnet-io/bitmagnet/internal/torznab/adapter.New() from module "torznab" [Fx] PROVIDE httpserver.Option[group = "http_server_options"] <= github.com/bitmagnet-io/bitmagnet/internal/torznab/httpserver.New() from module "torznab" [Fx] PROVIDE health.Option[group = "healthcheck_options"] <= github.com/bitmagnet-io/bitmagnet/internal/version/healthcheck.New() from module "version" [Fx] PROVIDE fx.Lifecycle <= go.uber.org/fx.New.func1() [Fx] PROVIDE fx.Shutdowner <= go.uber.org/fx.(*App).shutdowner-fm() [Fx] PROVIDE fx.DotGraph <= go.uber.org/fx.(*App).dotGraph-fm() [Fx] DECORATE *gorm.DB <= github.com/bitmagnet-io/bitmagnet/internal/database/migrations.NewDecorator() from module "app" [Fx] DECORATE *gorm.DB <= github.com/bitmagnet-io/bitmagnet/internal/database/cache.NewDecorator() from module "database" [Fx] RUN provide: fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:log" [Fx] RUN provide: fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:tmdb" [Fx] RUN provide: fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:dht_crawler" [Fx] RUN provide: fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:dht_server" [Fx] RUN provide: fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:postgres" [Fx] RUN provide: 
fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:gorm_cache" [Fx] RUN provide: fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:http_server" [Fx] RUN provide: fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:metainfo_requester" [Fx] RUN provide: fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:queue" [Fx] RUN provide: fx.Annotated{Group: "config_specs", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.NewConfigModule[...].func1()} from module "config:redis" [Fx] RUN provide: fx.Annotated{Group: "config_resolvers", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.New.func1()} from module "config" [Fx] RUN provide: fx.Annotated{Group: "config_resolvers", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.New.func2()} from module "config" [Fx] RUN provide: fx.Annotated{Group: "config_resolvers", Target: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx.New.func3()} from module "config" [Fx] RUN provide: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/validation.New() from module "validation" [Fx] RUN provide: github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config.New() from module "config" [Fx] Error returned: received non-nil error from function "github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config".New /build/internal/boilerplate/config/config.go:28: error coercing env key 'POSTGRES_PORT' with value '6463' to type uint: cannot coerce value to type uint [Fx] ERROR Failed to initialize custom logger: could not build arguments for function "go.uber.org/fx".(*module).constructCustomLogger.func2 /go/pkg/mod/go.uber.org/[email protected]/module.go:251: failed to build fxevent.Logger: could not build arguments for function "github.com/bitmagnet-io/bitmagnet/internal/boilerplate/logging/loggingfx".WithLogger.func1 /build/internal/boilerplate/logging/loggingfx/module.go:22: failed to build *zap.Logger: could not build arguments for function "github.com/bitmagnet-io/bitmagnet/internal/boilerplate/logging".New /build/internal/boilerplate/logging/logger.go:19: failed to build zap.Config: could not build arguments for function "github.com/bitmagnet-io/bitmagnet/internal/boilerplate/logging".NewZapConfig /build/internal/boilerplate/logging/config.go:24: failed to build logging.Config: could not build arguments for function "github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx".NewConfigModule[...].func2 /build/internal/boilerplate/config/configfx/factory.go:25: failed to build config.ResolvedConfig: received non-nil error from function "github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config".New /build/internal/boilerplate/config/config.go:28: error coercing env key 'POSTGRES_PORT' with value '6463' to type uint: cannot coerce value to type uint [Fx] ERROR Failed to start: could not build arguments for function "go.uber.org/fx".(*module).constructCustomLogger.func2 /go/pkg/mod/go.uber.org/[email protected]/module.go:251: failed to build fxevent.Logger: could not build arguments for function 
"github.com/bitmagnet-io/bitmagnet/internal/boilerplate/logging/loggingfx".WithLogger.func1 /build/internal/boilerplate/logging/loggingfx/module.go:22: failed to build *zap.Logger: could not build arguments for function "github.com/bitmagnet-io/bitmagnet/internal/boilerplate/logging".New /build/internal/boilerplate/logging/logger.go:19: failed to build zap.Config: could not build arguments for function "github.com/bitmagnet-io/bitmagnet/internal/boilerplate/logging".NewZapConfig /build/internal/boilerplate/logging/config.go:24: failed to build logging.Config: could not build arguments for function "github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config/configfx".NewConfigModule[...].func2 /build/internal/boilerplate/config/configfx/factory.go:25: failed to build config.ResolvedConfig: received non-nil error from function "github.com/bitmagnet-io/bitmagnet/internal/boilerplate/config".New /build/internal/boilerplate/config/config.go:28: error coercing env key 'POSTGRES_PORT' with value '6463' to type uint: cannot coerce value to type uint

Non-compliant DHT implementation

func (s *server) handleRecvMessage(msg RecvMsg) {
    s.mutex.Lock()
    defer s.mutex.Unlock()
    // Look up the pending query by the reply's transaction ID and forward
    // the message to it. Anything that isn't a reply to one of our own
    // queries (i.e. an incoming request from another node) is dropped.
    transactionId := msg.Reply.T
    ch, ok := s.queries[transactionId]
    if ok {
        ch <- msg
    }
}

This looks like it's not servicing any incoming requests. This is obviously not a spec-compliant DHT implementation, which means it's harmful to the P2P network because it uses resources without providing services.

If you expect many people to actually use this software, this will lead to abuse of the network, which will mean compliant implementations have to implement defenses to detect such behavior and refuse to cooperate with such peers.

I recommend writing a proper BEP5 implementation as a base (or using a library) and then adding BEP51 support on top.

The same applies to downloading torrent metadata. If they're stored on disk then such a server should also be able to serve the metadata to other clients if they ask.
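
As a very rough illustration of the first step (standalone, not using bitmagnet's types), answering an incoming BEP5 ping query with the anacrolix bencode package might look something like the following; find_node, get_peers and BEP51 sample_infohashes handling would build on the same pattern.

package main

import (
    "net"

    "github.com/anacrolix/torrent/bencode"
)

// krpcMsg is a minimal KRPC envelope (illustrative, not bitmagnet's types).
type krpcMsg struct {
    T string                 `bencode:"t"`
    Y string                 `bencode:"y"`
    Q string                 `bencode:"q,omitempty"`
    A map[string]interface{} `bencode:"a,omitempty"`
    R map[string]interface{} `bencode:"r,omitempty"`
}

// handlePacket answers incoming "ping" queries with our node ID, which is
// the minimum a BEP5-compliant node is expected to do.
func handlePacket(conn *net.UDPConn, from *net.UDPAddr, buf []byte, nodeID [20]byte) {
    var msg krpcMsg
    if err := bencode.Unmarshal(buf, &msg); err != nil {
        return
    }
    if msg.Y != "q" || msg.Q != "ping" {
        return // replies and other query types are handled elsewhere
    }
    resp := krpcMsg{
        T: msg.T, // echo the transaction ID
        Y: "r",
        R: map[string]interface{}{"id": string(nodeID[:])},
    }
    out, err := bencode.Marshal(resp)
    if err != nil {
        return
    }
    conn.WriteToUDP(out, from)
}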

error persisting torrents: extended protocol limited to 65535 parameters

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Describe the bug
In the logs I can see a PostgreSQL error:

ERROR        gorm        gorm/logger.go:72        gorm trace        {"location": "github.com/bitmagnet-io/bitmagnet/internal/database/dao/torrents.gen.go:691", "error": "extended protocol limited to 65535 parameters", "elapsed": 293.960545, "sql": "INSERT INTO \"torrents\" (\"info_hash\",\"name\",\"size\",\"private\",\"piece_length\",\"pieces\",\"created_at\",\"updated_at\",\"files_status\") VALUES ('<binary>',[...] DO UPDATE SET \"name\"=\"excluded\".\"name\",\"files_status\"=\"excluded\".\"files_status\",\"piece_length\"=\"excluded\".\"piece_length\",\"pieces\"=\"excluded\".\"pieces\"", "rows": 20}
github.com/bitmagnet-io/bitmagnet/internal/database/gorm.(*customLogger).Trace
        github.com/bitmagnet-io/bitmagnet/internal/database/gorm/logger.go:72
gorm.io/gorm.(*processor).Execute
        gorm.io/[email protected]/callbacks.go:134
gorm.io/gorm.(*DB).CreateInBatches.func1
        gorm.io/[email protected]/finisher_api.go:48
gorm.io/gorm.(*DB).Transaction
        gorm.io/[email protected]/finisher_api.go:647
gorm.io/gorm.(*DB).CreateInBatches
        gorm.io/[email protected]/finisher_api.go:60
gorm.io/gen.(*DO).CreateInBatches
        gorm.io/[email protected]/do.go:598
github.com/bitmagnet-io/bitmagnet/internal/database/dao.torrentDo.CreateInBatches
        github.com/bitmagnet-io/bitmagnet/internal/database/dao/torrents.gen.go:691
github.com/bitmagnet-io/bitmagnet/internal/dhtcrawler.(*crawler).runPersistTorrents
        github.com/bitmagnet-io/bitmagnet/internal/dhtcrawler/persist.go:43
ERROR        dht_crawler        dhtcrawler/persist.go:44        error persisting torrents: extended protocol limited to 65535 parameters
github.com/bitmagnet-io/bitmagnet/internal/dhtcrawler.(*crawler).runPersistTorrents
        github.com/bitmagnet-io/bitmagnet/internal/dhtcrawler/persist.go:44

To Reproduce

Not sure, but maybe due to save_pieces: true.

Expected behavior

No errors :)

General (please complete the following information):

  • Bitmagnet version: latest git v0.5.1
  • OS and version: Arch Linux

Crash with nil reference

Got this while running 0.0.5:

panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x68 pc=0x11e6283]

goroutine 59188 [running]:
github.com/bitmagnet-io/bitmagnet/internal/dht/crawler.(*crawler).requestPeersForHash.func1({0x19aaa38, 0xc0004de410})
	github.com/bitmagnet-io/bitmagnet/internal/dht/crawler/request_peers.go:56 +0x243
github.com/bitmagnet-io/bitmagnet/internal/dht/routing.(*peer).doLocked(0xc00173ca80, {0x19aaa38, 0xc0004de410}, 0xc0017eaea0)
	github.com/bitmagnet-io/bitmagnet/internal/dht/routing/routing.go:271 +0x64
github.com/bitmagnet-io/bitmagnet/internal/dht/routing.(*peer).WithLock(0xc00173ca80, {0x19aaa38, 0xc0004de410}, 0x11?)
	github.com/bitmagnet-io/bitmagnet/internal/dht/routing/routing.go:248 +0x12d
github.com/bitmagnet-io/bitmagnet/internal/dht/routing.(*table).WithPeer(0xc000607c50, {0x19aaa38, 0xc0004de410}, {{0xc00180a380?, 0x2?, 0x800?}, 0xc00130f860?}, 0xc00130f840?)
	github.com/bitmagnet-io/bitmagnet/internal/dht/routing/routing.go:198 +0x165
github.com/bitmagnet-io/bitmagnet/internal/dht/crawler.(*crawler).requestPeersForHash(0xc0001eee60, {0x19aaa38, 0xc0004de410}, {{0x3e, 0x74, 0x1d, 0x20, 0x6e, 0x97, 0x71, ...}, ...}, ...)
	github.com/bitmagnet-io/bitmagnet/internal/dht/crawler/request_peers.go:38 +0x127
github.com/bitmagnet-io/bitmagnet/internal/dht/crawler.(*crawler).handleStagingRequest(0xc0001eee60, {0x19aaa38, 0xc0004de410}, {{{0x3e, 0x74, 0x1d, 0x20, 0x6e, 0x97, 0x71, ...}, ...}, ...})
	github.com/bitmagnet-io/bitmagnet/internal/dht/crawler/handle_staging_request.go:44 +0x2e8
created by github.com/bitmagnet-io/bitmagnet/internal/dht/crawler.(*crawler).awaitInfoHashes
	github.com/bitmagnet-io/bitmagnet/internal/dht/crawler/handle_staging_request.go:19 +0x4d

crawling given info hashes

Is there a way I can configure it to crawl only given info hashes?
Use case:

I can input info hashes from dumps.
I can get info hashes from RSS feeds or other sources.

So after importing only the hashes, I want to be able to crawl those hashes and update the entries with file names, seeders, leechers, etc.

This way, after sharing info hash and name dumps, this software could be used to classify and fill in info such as file names.

Language detection glitches in video classifier

Describe the bug
We get quite a few incorrect language detections from the video classifier (especially for TV shows) due to how it looks for 3-letter ISO language codes in the section after the episode number.

To Reproduce
If you have "Series Name EP05E01 Episode Name Includes San Francisco" then this gets detected as being Sanskrit, or something with "Mac" in the episode name would be detected as Macedonian. Incidentally I'm not sure if Sanskrit should even be a possibility here as I think it's an ancient written language...?

Expected behavior
3-letter language codes should not be confused with 3-letter words that happen to be language codes. Ideally this should not come at the cost of missing genuine language codes. Perhaps something is needed to detect the episode-name part, though it could be tricky to do this reliably.
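
One hedged mitigation (illustrative only, not the actual classifier): only accept a 3-letter token as a language code when it isn't on a stop list of common words and names that collide with ISO 639-2 codes.

package main

import (
    "fmt"
    "strings"
)

// iso3 is a tiny excerpt of ISO 639-2 codes for illustration.
var iso3 = map[string]string{
    "eng": "English", "fra": "French", "ger": "German",
    "ita": "Italian", "san": "Sanskrit", "mac": "Macedonian",
}

// ambiguous lists codes that also occur as ordinary words or names.
var ambiguous = map[string]bool{
    "san": true, // San Francisco, San Andreas, ...
    "mac": true, // Mac, MacOS, Macbeth, ...
}

func detectLanguage(tokens []string) (string, bool) {
    for _, tok := range tokens {
        code := strings.ToLower(tok)
        if ambiguous[code] {
            continue
        }
        if name, ok := iso3[code]; ok {
            return name, true
        }
    }
    return "", false
}

func main() {
    fmt.Println(detectLanguage([]string{"Includes", "San", "Francisco"})) // "" false
    fmt.Println(detectLanguage([]string{"ITA", "1080p"}))                 // Italian true
}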

Ability to crawl Prowlarr indexer proxy

Is your feature request related to a problem? Please describe.
The Prowlarr API provides a common interface to many popular torrent indexers that could be crawled by Bitmagnet.

Describe the solution you'd like
Bitmagnet could be configured to crawl specified indexers configured in Prowlarr, for example all indexers tagged "bitmagnet". We'd need to consider questions like rate limiting, how often to poll for new content, how far back in the past to go etc...
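
A very rough sketch of the polling side, assuming Prowlarr's v1 search endpoint (/api/v1/search with an X-Api-Key header); the exact parameters, rate limits and response shape would need to be checked against the Prowlarr API docs.

package main

import (
    "fmt"
    "io"
    "net/http"
    "net/url"
)

// searchProwlarr issues a search against a Prowlarr instance and returns the
// raw response body (assumed endpoint and auth header; verify against docs).
func searchProwlarr(baseURL, apiKey, query string) ([]byte, error) {
    u := fmt.Sprintf("%s/api/v1/search?query=%s", baseURL, url.QueryEscape(query))
    req, err := http.NewRequest(http.MethodGet, u, nil)
    if err != nil {
        return nil, err
    }
    req.Header.Set("X-Api-Key", apiKey)
    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        return nil, err
    }
    defer resp.Body.Close()
    return io.ReadAll(resp.Body)
}

func main() {
    body, err := searchProwlarr("http://localhost:9696", "changeme", "ubuntu")
    if err != nil {
        panic(err)
    }
    fmt.Println(len(body))
}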

Running for a long time causes socket fd leak

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Describe the bug
After running for a while, the "closed" TCP connection count only goes up, until I manually restart the Docker container that bitmagnet is running inside (it can be over 10k after a few hours), as reported by ss -s.

To Reproduce
Steps to reproduce the behavior:

  1. Run it long enough

Expected behavior
"closed" TCP connection count remain low.

General (please complete the following information):

  • Bitmagnet version: 0.0.7
  • OS and version: Arch Linux (kernel 6.1.59)
  • Browser and version (if issue is with WebUI): Firefox 118.02

Additional context
None

docker-compose error

Used git to clone the repo, then ran
docker-compose up -d

got this error

ERROR: yaml.scanner.ScannerError: mapping values are not allowed here
  in "./../docker-compose.yaml", line 4, column 41

Not sure what I'm doing wrong... I'm using the included example docker-compose.dev.yml file

Language detection

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Is your feature request related to a problem? Please describe.
As a user I would like to classify torrents by language, in order to index and search for torrents in my preferred languages.

Describe the solution you'd like
Language detection for movie and TV content is already reasonably accurate; for other miscellaneous torrents we might consider using https://github.com/pemistahl/lingua-go. Any integration would need to consider accuracy and resource usage.
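
For illustration, a minimal use of lingua-go's documented builder API; resource usage grows with the number of languages loaded, so any integration would probably restrict the language set.

package main

import (
    "fmt"

    "github.com/pemistahl/lingua-go"
)

func main() {
    detector := lingua.NewLanguageDetectorBuilder().
        FromLanguages(lingua.English, lingua.French, lingua.German, lingua.Spanish, lingua.Italian).
        Build()

    if language, ok := detector.DetectLanguageOf("Le Fabuleux Destin d'Amélie Poulain"); ok {
        fmt.Println(language) // French
    }
}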

Removing the `LIKE` part of the search query and normalising unicode characters

Describe the bug
Searching with a query string is slow when there are many indexed torrents.

Currently search queries get transformed to 3 conditions like this:

tsv @@ plainto_tsquery('simple', normalized_input)
tsv @@ websearch_to_tsquery('simple', normalized_input)
search_string LIKE raw_input

This is obviously not great, and the only reason it's like this is that I didn't get round to addressing it before releasing the first alpha; it works fairly acceptably until you've got a few million torrents indexed...

The reason for the LIKE element is that to_tsvector('simple', search_string) was discarding unicode characters, breaking search in Asian and other unicode languages. It turns out Postgres doesn't offer a good way to convert a string to a tsvector that works well for all languages. Unfortunately the LIKE element of the query is by far the slowest.

Proposed solution
My proposed solution is to normalise all unicode characters within search strings and input query strings, using a library such as https://github.com/mozillazg/go-unidecode; we'll then have no unicode chars in the search_string or tsv fields, and the LIKE query can be removed, along with the supporting GIST index.

With this done, there'll need to be an easy way of running a one-time job to update the search_string fields with the new normalised string. A one-time job to reclassify everything should do the trick (though as a separate issue we also need a way to manually trigger a bulk reclassify)...
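
A small sketch of the proposed normalisation using go-unidecode's Unidecode function, which transliterates unicode to ASCII and would be applied to both the stored search_string and the incoming query:

package main

import (
    "fmt"

    "github.com/mozillazg/go-unidecode"
)

// normalize transliterates unicode characters to their closest ASCII form
// so the tsvector and the query agree on the same tokens.
func normalize(s string) string {
    return unidecode.Unidecode(s)
}

func main() {
    fmt.Println(normalize("Amélie Poulain")) // Amelie Poulain
    fmt.Println(normalize("千と千尋の神隠し")) // transliterated to ASCII
}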

Peer protocol support: Service meta info requests

Is your feature request related to a problem? Please describe.
The app currently has no peer protocol implementation for incoming requests.

Describe the solution you'd like
As a first step, the app could service meta info requests for a limited number of meta infos kept in memory.

Ability to configure displayed content types

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Is your feature request related to a problem? Please describe.
No

Describe the solution you'd like
Give an option to fully hide the XXX section from the web UI, either by setting an env variable in docker-compose or via a switch in the web UI.

Describe alternatives you've considered

Additional context
I'm not a fan of XXX content, and when running on a cloud provider I don't want to store or see this kind of content. It would be great if XXX content could be disabled from showing on the UI side.

Support for BitTorrent v2 file hash search

As I understand it, BitTorrent v2 support is planned for a future release. Could you please specify the features you have in mind? Additionally, will it be possible to search for file hashes (Merkle roots) extracted from hybrid and v2 files?

Wonderful project. I am hopeful for its longevity.

Fails to build with a go.sum mismatch for the asynqmon project

go-task build results in

verifying github.com/hibiken/[email protected]: checksum mismatch
	downloaded: h1:EfLRppj5GlklMPzdCjdonpXz/D23meW0Pk6NAtkOPhw=
	go.sum:     h1:YohWgTIPwtMyZ6khBDcVUz9BdSdQW2Dxn8SoxtbmjSg=

SECURITY ERROR
This download does NOT match an earlier download recorded in go.sum.
The bits may have been replaced on the origin server, or an attacker may
have intercepted the download attempt.

For more information, see 'go help module-auth'.
task: Failed to run task "build": exit status 1

Looks like this has been reported against the asynqmon project here: hibiken/asynqmon#318

My build script is somewhat atypical and looks like

podman container rm -f bitmanger-build
podman pull fedora:latest
podman container run -d --network host --name bitmanger-build fedora:latest sh -c 'mkfifo /tmp/block ; cat /tmp/block'
podman container exec -t bitmanger-build dnf install -y git go-task go
podman container exec -t bitmanger-build sh -c 'mkdir -p ~/go/src/github.com/bitmagnet-io/'
podman container exec -t bitmanger-build sh -c 'cd ~/go/src/github.com/bitmagnet-io/ && git clone https://github.com/bitmagnet-io/bitmagnet.git' 
podman container exec -t bitmanger-build sh -c 'cd ~/go/src/github.com/bitmagnet-io/bitmagnet && go-task build'
podman container exec -t bitmanger-build sh -c 'cd ~/go/src/github.com/bitmagnet-io/bitmagnet && POSTGRES_HOST=localhost POSTGRES_PASSWORD=postgres go-task migrate'
podman container cp bitmanger-build:/root/go/src/github.com/bitmagnet-io/bitmagnet/bitmagnet /tmp/bitmagnet
podman container rm -f bitmanger-build

It creates a container based on fedora:latest and builds bitmagnet inside that container.

Non-utf8 string in "error creating torrent model"

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Describe the bug

Occasionally this fails with a stacktrace.

ERROR   dht_staging     staging/staging.go:301  error creating torrent model: invalid utf8 string
github.com/bitmagnet-io/bitmagnet/internal/dht/staging.(*staging).Respond
        github.com/bitmagnet-io/bitmagnet/internal/dht/staging/staging.go:301
github.com/bitmagnet-io/bitmagnet/internal/dht/crawler.(*crawler).handleStagingRequest.func1
        github.com/bitmagnet-io/bitmagnet/internal/dht/crawler/handle_staging_request.go:40
github.com/bitmagnet-io/bitmagnet/internal/dht/crawler.(*crawler).handleStagingRequest
        github.com/bitmagnet-io/bitmagnet/internal/dht/crawler/handle_staging_request.go:77

To Reproduce
Steps to reproduce the behavior:

Unknown

Expected behavior

This error appears without any context.
If this is something that the operator can fix, the error should contain enough context to do that.
If this is a problem with an incoming packet that happens to have non-utf8 strings and can't be fixed, then maybe it can be degraded to a DEBUG/INFO without a stacktrace.

General (please complete the following information):

  • Bitmagnet version: 0.0.7
  • OS and version: nixos
  • Browser and version (if issue is with WebUI):
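
If the root cause is metadata arriving with invalid UTF-8 byte sequences, one hedged option is sanitising strings before building the model, e.g. with strings.ToValidUTF8 (illustrative, not the actual staging code):

package main

import (
    "fmt"
    "strings"
    "unicode/utf8"
)

// sanitizeName returns the name unchanged when it is valid UTF-8, otherwise
// replaces the invalid bytes with the unicode replacement character.
func sanitizeName(name string) string {
    if utf8.ValidString(name) {
        return name
    }
    return strings.ToValidUTF8(name, "\uFFFD")
}

func main() {
    fmt.Println(sanitizeName("ok name"))
    fmt.Println(sanitizeName("bad \xff byte"))
}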

Blanked out page buttons webui v0.4.1

Can't move through the torrent pages. It only shows the first page. I can change the number of torrents displayed on a page but can't advance. It also shows the number of torrents as 0; however, I have north of 2,800,000 torrents.

To Reproduce
Steps to reproduce the behavior:

  1. Start minimal docker-compose.yml
  2. Try to change torrent page

Expected behavior
It should change the torrent page, for example from 1-10 to 11-20.

General (please complete the following information):

  • Bitmagnet version: 0.4.1
  • OS and version: Linux
  • Browser and version: Brave browser 1:1.61.109-1

Additional context

Upgraded from v0.3.1

Advanced Search Options

An advanced search feature would greatly improve the usability of bitmagnet by allowing users to craft more precise and complex queries. Here are some suggested search options to implement:

  • Scope Search: Ability to limit the search scope to title only, content only, or both title and content.
  • Field-Specific Search: Ability to search within specific fields like title, description, tags, source, resolution, etc.
  • Exact Phrase Search: Search for an exact sequence of words by enclosing the phrase in double quotes, e.g., "Keyword1 keyword2". For instance, a search for "kw1 kw2" kw3 kw4 would return results containing kw1 and kw2 together (in that order), and kw3 and kw4 appearing anywhere in the result.
  • Boolean Operators: Support for combining keywords with boolean operators (AND &, OR |, NOT -) for more complex queries.
  • Wildcard Matching: Using wildcards like * or ? to match patterns (e.g., s01ep* to match episode numbers).
  • Regular Expression Search: Support for using regular expressions in search queries for advanced pattern matching.
  • Saved Searches: Allow users to save complex search queries for future use.
  • Numeric Range Filters: Filter results based on numeric ranges for fields like date, size, number of files, seeds, etc.

Implementing these features would allow users to craft very precise searches to quickly find their desired content. It would make Bitmagnet's search capabilities highly competitive with other torrent sites. I think they often don't offer these features on websites because they are resource-intensive, but they are a good fit for a self-hosted program. Please consider adding advanced search options!

There are other ideas, in this issue, for a visual-style advanced search instead of a syntax-based one.

Related

dht_crawler.save_pieces configuration option is ignored.

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Describe the bug
dht_crawler.save_pieces configuration option is ignored. It looks like it is not referenced at all in persist.go's createTorrentModel function, while the other space-saving option dht_crawler.save_files_threshold is referenced and respected. This causes the database to grow in size quite a bit faster than before.

To Reproduce
Steps to reproduce the behavior:

  1. Setup bitmagnet from docker
  2. Use the default configuration option for dht_crawler.save_pieces: false or set the option manually.
  3. Check the pieces and piece_length columns in torrents table of the database as things are crawled and see that they are not null.
  4. Watch the database expand as it fills with pieces data

Expected behavior
By default, pieces and piece_length should not be saved and should be null in the database.

General (please complete the following information):

  • Bitmagnet version: v0.1.0 (v0.0.7 seems unaffected)
  • OS and version: Fedora 39
  • Browser and version (if issue is with WebUI): N/A

Additional context
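
A hedged sketch of the gating the issue expects; field and config names here are illustrative, not necessarily bitmagnet's actual model:

package main

import "fmt"

// Illustrative types only; not bitmagnet's actual model or config structs.
type metaInfo struct {
    Hash        string
    Name        string
    Size        int64
    Pieces      []byte
    PieceLength int64
}

type crawlerConfig struct{ SavePieces bool }

type torrentModel struct {
    InfoHash    string
    Name        string
    Size        int64
    Pieces      []byte // expected to stay nil (NULL) when save_pieces is false
    PieceLength *int64
}

// createTorrentModel only copies pieces data onto the model when the
// save_pieces setting is enabled.
func createTorrentModel(info metaInfo, cfg crawlerConfig) torrentModel {
    m := torrentModel{InfoHash: info.Hash, Name: info.Name, Size: info.Size}
    if cfg.SavePieces {
        m.Pieces = info.Pieces
        m.PieceLength = &info.PieceLength
    }
    return m
}

func main() {
    m := createTorrentModel(metaInfo{Hash: "abc", Pieces: []byte{1, 2}}, crawlerConfig{SavePieces: false})
    fmt.Println(m.Pieces == nil, m.PieceLength == nil) // true true
}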

Option to Return Real-Debrid Cached Items/Hashes

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Is your feature request related to a problem? Please describe.
I'm part of several communities that typically only have 2 options when it comes to searching for cached torrent/magnet hashes on a given debrid service. Options like Orionoid or Torrentio are allowing us to create Plex/Jellyfin/Emby libraries quickly by returning hashes that are cached on a given debrid service.

Currently I've written a Torrentio indexer for Prowlarr that's been the closest thing to integrating the *Arr apps with Real-Debrid. We could take this a large step further if we had an application that searches the DHT as a whole and returns results based on "cached status".

Describe the solution you'd like
I think a query parameter at the end of the bitmagnet URL, something like &cached=real-debrid, could filter the results to what's been cached on Real-Debrid (or any of the debrid services). A Real-Debrid (or other debrid service) API key would have to be given as well.
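A rough sketch of what a pluggable cached-filter behind such a parameter could look like is below. The DebridChecker interface and fakeChecker are hypothetical; a real implementation would need the Real-Debrid (or other service's) endpoint and response format from their API docs, plus the user's API key.

package main

import (
	"context"
	"fmt"
)

// DebridChecker abstracts over any debrid service that can answer
// "is this info hash already cached?".
type DebridChecker interface {
	IsCached(ctx context.Context, infoHash string) (bool, error)
}

// filterCached keeps only the hashes the service reports as cached.
func filterCached(ctx context.Context, c DebridChecker, infoHashes []string) ([]string, error) {
	var cached []string
	for _, h := range infoHashes {
		ok, err := c.IsCached(ctx, h)
		if err != nil {
			return nil, err
		}
		if ok {
			cached = append(cached, h)
		}
	}
	return cached, nil
}

// fakeChecker stands in for a real HTTP client in this sketch.
type fakeChecker struct{ cached map[string]bool }

func (f fakeChecker) IsCached(_ context.Context, h string) (bool, error) { return f.cached[h], nil }

func main() {
	c := fakeChecker{cached: map[string]bool{"aaaa": true}}
	out, _ := filterCached(context.Background(), c, []string{"aaaa", "bbbb"})
	fmt.Println(out) // [aaaa]
}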

Describe alternatives you've considered
As mentioned before, my way of circumventing this was writing a custom indexer for Prowlarr. However, Torrentio has limitations, and I imagine I may see those same limitations in Orionoid too (I'll be working on that one soon).

  • Torrentio is wonderful for searching movies and works perfectly with the indexer I've made for Prowlarr. However, searching for series and having it play nicely with Sonarr has been challenging: a user can currently only search for a specific episode, given as imdbid:season_num:episode_num. Another thing to note is that Torrentio doesn't offer RSS feeds, and I suspect Orionoid won't either.

  • Orionoid: at first glance its API is well written, but the database of hashes feels a little lackluster compared to Torrentio.

The solution, I feel, would be an "on-demand" cache-checking application that can search many indexers and return only cached results.

Additional context
Hopefully this falls within the scope of your project. Here are a couple of resources to look at that may be helpful.

High memory growth over time

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Describe the bug
There seems to be a memory leak or unconstrained growth: the process gains roughly a couple of hundred MB of used memory every day.

To Reproduce
Steps to reproduce the behavior:

  1. Run the app
  2. Index ~150k entries over 3 days
  3. App takes >1GB of memory

Expected behavior
The app's memory usage should stay close to its initial size, or there should be a configuration option that sets an upper bound for caches.

General (please complete the following information):

  • Bitmagnet version: 0.0.7
  • OS and version: nixos

I'm happy to provide more info if there's a good way to get some more detailed numbers out of the app.
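In case it helps with getting numbers out: any Go process can expose a pprof endpoint in a couple of lines. This is a generic sketch (bitmagnet may or may not already expose something like it), assuming port 6060 is free on the host.

package main

import (
	"log"
	"net/http"
	_ "net/http/pprof" // registers the /debug/pprof/* handlers on the default mux
)

func main() {
	// Then: go tool pprof http://localhost:6060/debug/pprof/heap
	// shows exactly which allocations are holding the in-use memory.
	log.Println(http.ListenAndServe("localhost:6060", nil))
}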

allow configured threshold for files in a torrent

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Is your feature request related to a problem? Please describe.
When the crawler gets a big torrent with lots of files, I see this message:
"Files information was not saved as the number of files is over the configured threshold."

Describe the solution you'd like
Allow users to change this threshold.
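Another issue on this page mentions a dht_crawler.save_files_threshold option, and torrents above the limit get a files_status of over_threshold, so presumably the check looks roughly like the sketch below (names are illustrative and may not match the real code).

package main

import "fmt"

type crawlerConfig struct {
	SaveFilesThreshold int // e.g. dht_crawler.save_files_threshold
}

// filesStatus sketches the gate: above the threshold the file list is not
// persisted and the torrent is just marked over_threshold.
func filesStatus(cfg crawlerConfig, fileCount int) string {
	if cfg.SaveFilesThreshold > 0 && fileCount > cfg.SaveFilesThreshold {
		return "over_threshold"
	}
	if fileCount > 1 {
		return "multi"
	}
	return "single"
}

func main() {
	fmt.Println(filesStatus(crawlerConfig{SaveFilesThreshold: 100}, 5000)) // over_threshold
}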

Describe alternatives you've considered
nothing?

Additional context
Is this because the app tries to limit the size of the database? If the threshold is changed, will torrents crawled before the change get updated with the new file info?

By the way, if I delete a torrent from the list, does it get added back to the database after an app restart, or is it gone forever?

Add version number to web page

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Is your feature request related to a problem? Please describe.
It's not easy to determine the version of Bitmagnet that is currently running.

Describe the solution you'd like
A version number displayed on the Bitmagnet page - in the menu, along the title bar, bottom left corner, wherever.

Torrent tagging

Is your feature request related to a problem? Please describe.
As a user I would like the ability to add any arbitrary tags to torrents, in order to curate content and add information not included in the domain model.

Describe the solution you'd like
It should be possible to add tags to torrents either via the UI or the API, individually or in bulk*, and to be able to search/filter by these.

This should work with the Torznab integration, allowing a Torznab output filtered by specified tags. Support for saved searches with Torznab will be implemented separately.

  • bulk actions in the UI will be implemented separately
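As a starting point for discussion, a hypothetical data-model sketch for tags: a simple join table keyed on (info_hash, name) that the UI/API could write to and that search, filtering and the Torznab output could join against. All names and GORM tags here are illustrative only, not a proposed schema.

package model

import "time"

// TorrentTag attaches an arbitrary user-defined tag to a torrent.
type TorrentTag struct {
	InfoHash  []byte `gorm:"primaryKey;type:bytea"`
	Name      string `gorm:"primaryKey"`
	CreatedAt time.Time
	UpdatedAt time.Time
}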

Additional context
Some of this has been discussed here: #35

CLI help doesn't work well without a working configuration

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Describe the bug
Multiple bitmagnet CLI commands cause numerous errors due to the lack of a working configuration, before (or instead of) giving the expected help output.

To Reproduce

  1. Run postgres/redis in a non-default location.
  2. Have a copy of the bitmagnet daemon running.
  3. Run bitmagnet config --help
  4. Get:
ERROR   migrations/decorator.go:31      failed to ping database: failed to connect to `host=localhost user=postgres database=bitmagnet`: server error (FATAL: no pg_hba.conf entry for host "127.0.0.1", user "postgres", database "bitmagnet", no encryption (SQLSTATE 28000))
github.com/bitmagnet-io/bitmagnet/internal/database/migrations.NewDecorator
...
WARN    tmdb_client     tmdb/client.go:40       you are using the default TMDB api key; TMDB requests will be limited to 1 per second; to remove this warning please configure a personal TMDB api key
2023/11/27 10:58:06 Failed to collect metrics data: failed to get queue names: dial tcp [::1]:6379: connect: connection refused
ERROR   fx      fxevent/zap.go:59       OnStart hook failed     {"callee": "github.com/bitmagnet-io/bitmagnet/internal/protocol/dht/server.New.func1()", "caller": "github.com/bitmagnet-io/bitmagnet/internal/boilerplate/cli/hooks.New", "error": "could not open socket: address already in use"}
...
ERROR   fx      fxevent/zap.go:59       start failed, rolling back      {"error": "could not open socket: address already in use"}
...
ERROR   fx      fxevent/zap.go:59       start failed    {"error": "could not open socket: address already in use"}

Expected behavior
Display the help output for that command without any attempts to connect to the databases.
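A minimal sketch of the behaviour being asked for, independent of whichever CLI framework bitmagnet actually uses: detect --help / -h up front and print usage before any database, Redis or DHT dependencies are constructed.

package main

import (
	"fmt"
	"os"
)

// wantsHelp reports whether the invocation is only asking for usage output.
func wantsHelp(args []string) bool {
	for _, a := range args {
		if a == "--help" || a == "-h" || a == "help" {
			return true
		}
	}
	return false
}

func main() {
	if wantsHelp(os.Args[1:]) {
		fmt.Println("usage: bitmagnet <command> [flags]")
		return // no database/Redis/DHT setup needed for help output
	}
	// ... only now wire up the app (Postgres, Redis, DHT server, ...) and run the command
}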

General (please complete the following information):

  • Bitmagnet version: 0.2.0
  • OS and version: nixos

Docker-compose is not working out of the box: FATAL error.

I tried the provided docker-compose and changed only the minimum: the DB password and the ./data/postgres and ./data/redis paths.

Postgres is giving me endless FATAL: no pg_hba.conf entry for host "192.168.176.4", user "postgres", database "bitmagnet", no encryption

I found nothing about this issue in the docs or Google.

Any guidance would really be appreciated.

Could bitmagnet be causing mass private IP scanning

I recently got an email from Hetzner about an attack coming from my server. Would it be possible that Bitmagnet mass-scans private IPs if the port is not forwarded?

> TIME                               SRC           SRC-PORT  ->  DST              DST-PORT  SIZE  PROT
> ----------------------------------------------------------------------------------------------------------
> 2023-12-05 17:17:27.274984369 UTC  SERVERPUBLIIP      3334  ->         10.0.0.2     43790   149  UDP
> 2023-12-05 17:15:08.885498594 UTC  SERVERPUBLIIP      60760  ->        10.0.0.13     50413    78  TCP
> 2023-12-05 17:58:27.953682074 UTC  SERVERPUBLIIP      45130  ->       10.0.0.138      2450    78  TCP
> 2023-12-05 17:58:30.961633233 UTC  SERVERPUBLIIP      51848  ->       10.0.0.138      4498    78  TCP

IPV6 support

Is your feature request related to a problem? Please describe.
The DHT server and crawler currently support IPV4 only.

Describe the solution you'd like
IPV6 should be supported in addition to IPV4.
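At the socket level this roughly means opening both an IPv4 and an IPv6 UDP socket for the DHT. A minimal sketch follows; port 3334 is assumed as the DHT port here, and the real wiring inside bitmagnet will differ.

package main

import (
	"log"
	"net"
)

func main() {
	conn4, err := net.ListenUDP("udp4", &net.UDPAddr{IP: net.IPv4zero, Port: 3334})
	if err != nil {
		log.Fatal(err)
	}
	defer conn4.Close()

	conn6, err := net.ListenUDP("udp6", &net.UDPAddr{IP: net.IPv6unspecified, Port: 3334})
	if err != nil {
		log.Fatal(err)
	}
	defer conn6.Close()

	log.Println("DHT listening on", conn4.LocalAddr(), "and", conn6.LocalAddr())
	// ... hand both connections to the DHT server / crawler
}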

Container starts before database is ready

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Describe the bug
An SQL error is shown when TMDB_API_KEY is not provided.

To Reproduce
Steps to reproduce the behavior:

  1. Create a new bitmagnet instance without an API key
  2. Go to the web UI.

Expected behavior
It should show a more descriptive error message saying that TMDB_API_KEY is missing.

General (please complete the following information):

  • Bitmagnet version: 0.1.1
  • OS and version: Docker on Debian 12.

Additional context

Screenshot:

Snapshot_2023-11-15_13-39-44

SQL Error in logs

Have you checked the roadmap on the website, and existing issues, before opening a duplicate issue?
Yes

Describe the bug
Sometimes I get ERROR: invalid byte sequence for encoding \"UTF8\": 0x00 (SQLSTATE 22021) in the logs,
and after that: error persisting torrents: ERROR: deadlock detected (SQLSTATE 40P01); ERROR: deadlock detected (SQLSTATE 40P01)
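The first error suggests a torrent name (or file path) containing a raw NUL byte (0x00), which Postgres text columns reject. Below is a small sketch of the sanitization idea, not a patch against bitmagnet's actual code.

package main

import (
	"fmt"
	"strings"
)

// sanitizeForPostgres strips NUL bytes (which cause SQLSTATE 22021) and replaces
// any other invalid UTF-8 so the value can be stored in a text column.
func sanitizeForPostgres(s string) string {
	s = strings.ReplaceAll(s, "\x00", "")
	return strings.ToValidUTF8(s, "\uFFFD")
}

func main() {
	fmt.Printf("%q\n", sanitizeForPostgres("bad\x00name\xff")) // "badname�"
}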

To Reproduce
Wait for it

Expected behavior
No error

General (please complete the following information):

  • Bitmagnet version: 0.3.1
  • OS and version: docker from GH

Additional context
Here are the logs:

ERROR	gorm	gorm/logger.go:72	gorm trace	{"location": "/build/internal/database/dao/torrents.gen.go:616", "error": "ERROR: deadlock detected (SQLSTATE 40P01)", "elapsed": 1011.070165, "sql": "INSERT INTO \"torrents_torrent_sources\" (\"source\",\"info_hash\",\"import_id\",\"bfsd\",\"bfpe\",\"seeders\",\"leechers\",\"published_at\",\"created_at\",\"updated_at\") VALUES ('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht',':)\ufffd\ufffd\ufffd89\ufffd==\ufffdI-Qh\ufffd\ufffd=6a',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','8\ufffd\ufffdiP\ufffd<=\ufffd\ufffd\ufffd\ufffd\ufffd\ufffd\ufffd\ufffdoH\ufffd5',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','\ufffd\ufffdŇ̑\ufffdxq\ufffd\ufffdٛ\ufffdo\ufffd9\ufffd\ufffd\ufffd',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 
11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','|\ufffd&\"\ufffd]J\ufffd3\ufffd\ufffdGˆ\ufffd\ufffd\ufffd\ufffd*\ufffd',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','\ufffdY\ufffd2\ufffd\ufffd\ufffd4\ufffd\ufffd\ufffd=\ufffd\ufffdUXؖa\ufffd',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','\ufffdNa\ufffdA\ufffdH.c\ufffd\ufffdx|\ufffdo\ufffdXm\ufffd\ufffd',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 
11:54:10.604'),('dht','7p(d#\ufffd\ufffd\ufffd\ufffd\ufffd\ufffd\ufffdb\ufffd\ufffdz\ufffd3\ufffd\ufffd',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','\ufffd\ufffdZ\ufffd\ufffd\ufffd,/,z\ufffdh\ufffd=L\ufffdnT\ufffd\ufffd',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','\ufffd\ufffdңX\ufffd\ufffd\ufffd*e\ufffd\ufffd&\ufffd\ufffdkF$tL',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 
11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604'),('dht','<binary>',NULL,'','',NULL,NULL,'0000-00-00 00:00:00','2023-12-10 11:54:10.604','2023-12-10 11:54:10.604') ON CONFLICT (\"source\",\"info_hash\") DO UPDATE SET \"info_hash\"=\"excluded\".\"info_hash\"", "rows": 0}

github.com/bitmagnet-io/bitmagnet/internal/database/gorm.(*customLogger).Trace

	/build/internal/database/gorm/logger.go:72

gorm.io/gorm.(*processor).Execute

	/go/pkg/mod/gorm.io/[email protected]/callbacks.go:134

gorm.io/gorm.(*DB).Create

	/go/pkg/mod/gorm.io/[email protected]/finisher_api.go:24

gorm.io/gorm/callbacks.saveAssociations

	/go/pkg/mod/gorm.io/[email protected]/callbacks/associations.go:431

gorm.io/gorm/callbacks.RegisterDefaultCallbacks.SaveAfterAssociations.func4

	/go/pkg/mod/gorm.io/[email protected]/callbacks/associations.go:258

gorm.io/gorm.(*processor).Execute

	/go/pkg/mod/gorm.io/[email protected]/callbacks.go:130

gorm.io/gorm.(*DB).CreateInBatches.func1

	/go/pkg/mod/gorm.io/[email protected]/finisher_api.go:48

gorm.io/gorm.(*DB).Transaction

	/go/pkg/mod/gorm.io/[email protected]/finisher_api.go:647

gorm.io/gorm.(*DB).CreateInBatches

	/go/pkg/mod/gorm.io/[email protected]/finisher_api.go:60

gorm.io/gen.(*DO).CreateInBatches

	/go/pkg/mod/gorm.io/[email protected]/do.go:598

github.com/bitmagnet-io/bitmagnet/internal/database/dao.torrentDo.CreateInBatches

	/build/internal/database/dao/torrents.gen.go:616

github.com/bitmagnet-io/bitmagnet/internal/dhtcrawler.(*crawler).runPersistTorrents

	/build/internal/dhtcrawler/persist.go:43

ERROR	gorm	gorm/logger.go:72	gorm trace	{"location": "/build/internal/database/dao/torrents.gen.go:616", "error": "ERROR: deadlock detected (SQLSTATE 40P01); ERROR: deadlock detected (SQLSTATE 40P01)", "elapsed": 1122.995223, "sql": "INSERT INTO \"torrents\" (\"info_hash\",\"name\",\"size\",\"private\",\"piece_length\",\"pieces\",\"search_string\",\"created_at\",\"updated_at\",\"files_status\") VALUES ('<binary>','Guy.Martin.The.Worlds.Fastest.Van.2018.DVDRip.x264-GHOULS[EtMovies]',756304429,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','[Wamusato Haru] Shinkon Shimai - CH09 [English Translated by Tonigobe].rar',22198326,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','Tangled.2010.1080p.BluRay.x264-CiNEFiLE',8213728863,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Annie Lennox - Discography [FLAC Songs] [PMEDIA] ⭐️',3588211482,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','over_threshold'),('<binary>','精东影业168.【国产 剧情】四个女人一台戏.矫情护士装要老命了.藤田美绪-真希波-桥本爱菜- 高桥熏 -高清原版无水印.TS',3329639988,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','Sex bus.mp4',498753424,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','Apashe & Snails - Bubble Gun.mp3',8521875,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','VAGU-060',838522865,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Univer.10.let.spustya.S01.2021.WEB-DL.1080p',19057824672,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Manga-Zip.info_Koucha Ouji  vol 01-25.rar',1056322851,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','Beta_Version-2846-0-0-1-1617872334.rar',526981453,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','zzpp01.com_cemd148',4267648039,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Bebe Rexha - You Can\\'t Stop the Girl (The Ellen Degeneres Show) 2019.ts',314251024,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','Cosmid.23.03.27.Bella.Crystal.XXX.720p.HEVC.x265.PRT[XvX]',92313748,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','[HYSUB]Golden Kamuy[37~49+SP][BIG5_MP4][1920X1080]',4430208787,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','kpkp69.com_241GAREA-545',2383019342,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','over_threshold'),('<binary>','[FPP-02] Underground XXX Tape of an Actress - Ai Misaki',1146048991,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','SIRO-4546',1940500271,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','[3D-Hentai] [ソクラテス] 有奈の不思議な処女喪失 ~ 双子のキモ・デブオヤジに支配された世界へ迷い込み ~ [Socrates] Virginity Lost of Yuna in Wonderland',1916267321,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','over_threshold'),('<binary>','youiv.net-OME-158.mp4',1329895251,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),(':)\ufffd\ufffd\ufffd89\ufffd==\ufffdI-Qh\ufffd\ufffd=6a','FC2-PPV-3750150',1441943341,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 
11:54:10.496','multi'),('<binary>','mods.rar',448931432,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','La.France.Profonde.Special.Professions.2.FRENCH.XXX.DVDRip.XviD-TESORO',1471164416,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('8\ufffd\ufffdiP\ufffd<=\ufffd\ufffd\ufffd\ufffd\ufffd\ufffd\ufffd\ufffdoH\ufffd5','IPZ-320,.wmv',1846611425,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','Артур Конан Дойл - Детская мировая классика',6849944,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('\ufffd\ufffdŇ̑\ufffdxq\ufffd\ufffdٛ\ufffdo\ufffd9\ufffd\ufffd\ufffd','Nancy.Drew.S02E04.rus.LostFilm.TV.avi',660047872,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','Gimenez Bartlett, Alicia - [Petra Delicado 02] Dia de perros [10127] (r1.4).epub',419130,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','exclusiveclub.com_elsa_1_1280x720.wmv',1244264875,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','MommysGirl.19.11.23.Scarlett.Sage.And.Kit.Mercer.Graduation.Gift.XXX.SD.MP4-KLEENEX',319327822,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','ETQR-065.mp4',3707680133,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','TeenFidelity.E369.Grae.Stoke.Air.Slutt.XXX.720p.WEB.x264-GalaXXXy[XvX]',439238244,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','052116_571',2350862856,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','王者级玩家SM界天花板价值千元玩女大师各种调教*母狗',3676565908,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','dn2021-0604.mp4',543018996,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','Bluey S02E52 - Easter.mp4',89592278,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','B.B. King & Eric Clapton-2000-Riding with the King (2020 Deluxe Edition)',496468148,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('|\ufffd&\"\ufffd]J\ufffd3\ufffd\ufffdGˆ\ufffd\ufffd\ufffd\ufffd*\ufffd','Растения – твои друзья и недруги.rtf',4487591,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','Francuzskij.vestnik.2021.BDRip',2346419615,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','No.Name.on.the.Bullet.1959.BRRip.XviD.MP3-XVID',1042394884,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('\ufffdY\ufffd2\ufffd\ufffd\ufffd4\ufffd\ufffd\ufffd=\ufffd\ufffdUXؖa\ufffd','RSDK0401TT8557',360153476,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Fear.the.Walking.Dead.S07E09.1080p.WEB.H264-CAKES[rarbg]',5512400001,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','第一會所新片@SIS001@(353HEN)(353HEN-013)私服_スク水_女性の全てがオナニーのネタである_013_美竹すずさん_24歳',5406248062,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','New.Didi.Zerati.Priva.te.E.nglish.Less.on.10.07.2023.Hardcore.Roleplay.Bigtits.ILUVY.mp4',576622266,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','Маэда Дзюн - Ангельские ритмы! 
Нулевой трек (Otsuru)',454141845,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','AMD-FORCED-10x64-FirePro_23q3.1_31.0.21023.2010_iCafe-drp.zip',345484420,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','ABF-009.mp4',1631817869,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','SUPA-573',5553378477,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','BrandNewAmateurs.21.03.26.Annie.Archer.Part.2.XXX.1080p.HEVC.x265.PRT',613052063,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('\ufffdNa\ufffdA\ufffdH.c\ufffd\ufffdx|\ufffdo\ufffdXm\ufffd\ufffd','Врата ада - Hellgate (by ale_x2008).avi',1570099200,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','Шерлох (Шерлок) (2015)',4693143666,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','kpkp3.com-KCPN025',1595932820,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','over_threshold'),('<binary>','The.Nature.of.Things.Season.54.HDTV.x264-CLDD',7095910041,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','绝种好男人.2003.1080P.WEB-DL.X264.AAC.CHS.mp4',2861372634,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','Kyo_Kara_Maoh!_76_HD_SUB_ITA_[yaoi801fansub.altervista.org].mp4',185222373,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','MILD707MP4',1689079469,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','over_threshold'),('7p(d#\ufffd\ufffd\ufffd\ufffd\ufffd\ufffd\ufffdb\ufffd\ufffdz\ufffd3\ufffd\ufffd','Michael Parenti - 2022 - The Assassination of Julius Caesar (History)',211417212,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('\ufffd\ufffdZ\ufffd\ufffd\ufffd,/,z\ufffdh\ufffd=L\ufffdnT\ufffd\ufffd','Lynda - Database Fundamentals - Storage',223469651,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Spice Girls - Super Hits Collection (2015)',183745607,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','MDYD940C',1348664262,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','over_threshold'),('<binary>','TUSHY_100466-MARLEY_720P.mp4',2539506107,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','NaughtyOffice.20.06.08.Skye.Blue.XXX.VR180.2048p.MP4-VACCiNE[rarbg]',11882658695,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','[**着名人体模特【汤芳】图片作品全集(高分辨率)',780918452,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','over_threshold'),('<binary>','[Nekomoe kissaten][Shiro Seijo to Kuro Bokushi][08][720p][JPTC].mp4',169784878,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','caa0f8133480d1ff33ed2451cd92f45c',9707589348,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','over_threshold'),('<binary>','КХЛ 01.02.2011 Автомобилист-Торпедо.avi',1282621432,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','[ThZu.Cc]mism-108',3951662616,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','[email protected]@好色爷爷的性福生活 禁断介護 合集系列 8',4205838709,false,NULL,'','','2023-12-10 
11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Guerreros del Espacio (1984, The Ice Pirates) (Spa.Eng.Subs).mkv',3651921782,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','#_KTDS532',1967218819,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Sudden_Strike_4_1.15.30080_(28385)_win_gog',8750196849,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','[Tus libros 106] Dostoyevski, Fiodor - Crimen y castigo (ilustrado) [52003] (r1.0).epub',2302788,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','(DVDMaster) The Fruit is Swelling 1997',3508312064,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Maze (2017) [1080p] [YTS.AG]',1512526857,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','The Art of Anal Sex 4',2938252190,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('\ufffd\ufffdңX\ufffd\ufffd\ufffd*e\ufffd\ufffd&\ufffd\ufffdkF$tL','Nora Roberts-Black Rose AUDIO',316622845,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','whichav.com.磨到高潮的桌角自慰.ts',122441204,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','Teen Creeps - 2021 - Forever (CD)',305043116,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Город мастеров',724901888,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','118877.xyz SSIS-806',5531873069,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','F1.2023.Round.21.Brazilian.Weekend.SkyF1.1080P',7789011871,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Vrachebnaya.Oshibka.2020.WEBRip.shevanlk',2677690368,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Joy - Hello - 1986,(Germany),DSF(tracks),(VM-540MLt(63pF)+MFavb(70kOm)+Rpm-p)',2799094295,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Repo Girl XXX (2004) DVDRip',1465958402,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Стоев Андрей - За последним порогом 6. Паутина Книга 2 (Пожилой Ксеноморф)',474408588,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','[Patience] Bungou Stray Dogs - 56 [1080p][3875BD19].mkv',915723567,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','KarupsPC.13.05.31.Kalea.Taylor.Solo.2.XXX.720p.x264-SEXORS[rarbg]',495186505,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Dylan Ryder, Kagney Lynn Carter, Lexi Swallow - Boner League.1280x720.mp4',969222014,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','【入微】AXDVD-092R 女体料理 陵辱フルコース 2',584699765,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Бушков А. - Сварог 17. 
Вертикальная вода.(чит.Кирсанов С.)',345656968,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','[www.mp4kan.com]t我d电b吧.E08.HD1080p.mp4',1342894657,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','IPX-624.mp4',2834827449,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','single'),('<binary>','21EroticAnal.21.03.12.Alexis.Crystal.Horny.Yoga.XXX.720p.MP4-XXX',437546076,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','NDRA-104_1K',1205233460,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','全新2021顶级时尚女神级美女在想什么呢 原版流出',803244389,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Christian McBride - The Movement Revisited - A Musical Portrait of Four Icons (2020) {Mack Avenue Records} [WEB FLAC 24 -96]',1412482995,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Виль Липатов - Любовь в Старокороткино - 1982',107419533,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi'),('<binary>','Leaves\\' Eyes - 2011 - Meredead (Standard Edition)',540440190,false,NULL,'','','2023-12-10 11:54:10.496','2023-12-10 11:54:10.496','multi') ON CONFLICT (\"info_hash\") DO UPDATE SET \"name\"=\"excluded\".\"name\",\"files_status\"=\"excluded\".\"files_status\",\"piece_length\"=\"excluded\".\"piece_length\",\"pieces\"=\"excluded\".\"pieces\"", "rows": 97}

github.com/bitmagnet-io/bitmagnet/internal/database/gorm.(*customLogger).Trace

	/build/internal/database/gorm/logger.go:72

gorm.io/gorm.(*processor).Execute

	/go/pkg/mod/gorm.io/[email protected]/callbacks.go:134

gorm.io/gorm.(*DB).CreateInBatches.func1

	/go/pkg/mod/gorm.io/[email protected]/finisher_api.go:48

gorm.io/gorm.(*DB).Transaction

	/go/pkg/mod/gorm.io/[email protected]/finisher_api.go:647

gorm.io/gorm.(*DB).CreateInBatches

	/go/pkg/mod/gorm.io/[email protected]/finisher_api.go:60

gorm.io/gen.(*DO).CreateInBatches

	/go/pkg/mod/gorm.io/[email protected]/do.go:598

github.com/bitmagnet-io/bitmagnet/internal/database/dao.torrentDo.CreateInBatches

	/build/internal/database/dao/torrents.gen.go:616

github.com/bitmagnet-io/bitmagnet/internal/dhtcrawler.(*crawler).runPersistTorrents

	/build/internal/dhtcrawler/persist.go:43

ERROR	dht_crawler	dhtcrawler/persist.go:44	error persisting torrents: ERROR: deadlock detected (SQLSTATE 40P01); ERROR: deadlock detected (SQLSTATE 40P01)

github.com/bitmagnet-io/bitmagnet/internal/dhtcrawler.(*crawler).runPersistTorrents

	/build/internal/dhtcrawler/persist.go:44

WARN	gorm	gorm/logger.go:76	gorm trace	{"location": "/build/internal/database/dao/torrents_torrent_sources.gen.go:366", "slowLog": "SLOW SQL >= 3s", "elapsed": 3699.619597, "sql": "INSERT INTO \"torrents_torrent_sources\" (\"source\",\"info_hash\",\"import_id\",\"bfsd\",\"bfpe\",\"seeders\",\"leechers\",\"published_at\",\"created_at\",\"updated_at\") VALUES ('dht','<binary>',NULL,'<binary>','<binary>',0,2,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',2,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',3,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',2,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',3,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','G\ufffd{\\UMk\ufffd4l\ufffd\ufffdNCV\ufffd%I7:',NULL,'<binary>','<binary>',0,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,4,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,4,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',2,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',6,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,7,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',4,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',3,7,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 
11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,2,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',2,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,6,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',3,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',2,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,7,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,9,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,2,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,18,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',3,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,5,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,3,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,3,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',2,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,2,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,2,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','\ufffd\ufffdo\"\ufffdKًo\ufffd\ufffd\ufffd\ufffdE\ufffd\ufffdb\ufffd[\ufffd',NULL,'<binary>','<binary>',0,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',2,4,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,3,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',2,4,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 
11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,2,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',3,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',6,21,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,4,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',2,30,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','\ufffdyd?\ufffdp\ufffd-\ufffd\ufffd\ufffd]\ufffd\ufffd&6K\ufffd!\ufffd',NULL,'<binary>','<binary>',0,5,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','F<G\ufffd\ufffd\ufffd\ufffd9_wEo֖!槧.\ufffd',NULL,'<binary>','<binary>',0,2,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,2,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',8,22,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,3,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','A\ufffdԛN.h{\ufffdX\ufffdm\ufffdT\ufffd\ufffd\ufffd0_\ufffd',NULL,'<binary>','<binary>',2,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 
11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',4,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,6,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',3,3,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','\ufffd\ufffd\ufffd-;r\ufffd\ufffdgT\ufffd\ufffd\ufffd\ufffdI\ufffd+mfi',NULL,'<binary>','<binary>',0,6,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',6,5,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,5,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',3,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,2,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,2,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,0,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',1,1,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923'),('dht','<binary>',NULL,'<binary>','<binary>',0,12,'0000-00-00 00:00:00','2023-12-10 11:54:07.923','2023-12-10 11:54:07.923') ON CONFLICT (\"source\",\"info_hash\") DO UPDATE SET \"updated_at\"='2023-12-10 11:54:07.923',\"import_id\"=\"excluded\".\"import_id\",\"bfsd\"=\"excluded\".\"bfsd\",\"bfpe\"=\"excluded\".\"bfpe\",\"seeders\"=\"excluded\".\"seeders\",\"leechers\"=\"excluded\".\"leechers\",\"published_at\"=\"excluded\".\"published_at\"", "rows": 100}

WARN	dht_crawler	dhtcrawler/bootstrap.go:20	failed to resolve bootstrap node address: lookup dht.anacrolix.link on 127.0.0.11:53: no such host

ERROR	gorm	gorm/logger.go:72	gorm trace	{"location": "/build/internal/database/dao/torrents.gen.go:616", "error": "ERROR: invalid byte sequence for encoding \"UTF8\": 0x00 (SQLSTATE 22021)", "elapsed": 17.420731, "sql": "INSERT INTO \"torrent_files\" (\"info_hash\",\"index\",\"path\",\"size\",\"created_at\",\"updated_at\") VALUES ('<binary>',0,'Borysenko, Karlyn - Actively Unwoke — 01.mp3',233224,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',1,'Borysenko, Karlyn - Actively Unwoke — 02.mp3',7667080,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',2,'Borysenko, Karlyn - Actively Unwoke — 03.mp3',10383688,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',3,'Borysenko, Karlyn - Actively Unwoke — 04.mp3',26941384,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',4,'Borysenko, Karlyn - Actively Unwoke — 05.mp3',24887944,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',5,'Borysenko, Karlyn - Actively Unwoke — 06.mp3',29264008,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',6,'Borysenko, Karlyn - Actively Unwoke — 07.mp3',19816072,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',7,'Borysenko, Karlyn - Actively Unwoke — 08.mp3',17765896,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',8,'Borysenko, Karlyn - Actively Unwoke — 09.mp3',11628040,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',9,'Borysenko, Karlyn - Actively Unwoke — 10.mp3',9069064,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',10,'Borysenko, Karlyn - Actively Unwoke — 11.mp3',15240712,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',11,'Borysenko, Karlyn - Actively Unwoke — 12.mp3',10248904,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',12,'Borysenko, Karlyn - Actively Unwoke — 13.mp3',2867080,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',13,'Borysenko, Karlyn - Actively Unwoke — 14.mp3',1169992,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',0,'Subtitle/English/War.and.Peace.2016.s01e01.HDTV.1080p.LinkHudi.eng.srt',56667,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',1,'Subtitle/English/War.and.Peace.2016.s01e02.HDTV.1080p.LinkHudi.eng.srt',50015,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',2,'Subtitle/English/War.and.Peace.2016.s01e03.HDTV.1080p.LinkHudi.eng.srt',57076,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',3,'Subtitle/English/War.and.Peace.2016.s01e04.HDTV.1080p.LinkHudi.eng.srt',58713,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',4,'Subtitle/English/War.and.Peace.2016.s01e05.HDTV.1080p.LinkHudi.eng.srt',52144,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',5,'Subtitle/English/War.and.Peace.2016.s01e06.HDTV.1080p.LinkHudi.eng.srt',65914,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',6,'Subtitle/Russian/War.and.Peace.2016.s01e01.HDTV.1080p.LinkHudi.rus.srt',74760,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',7,'Subtitle/Russian/War.and.Peace.2016.s01e02.HDTV.1080p.LinkHudi.rus.srt',63521,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',8,'Subtitle/Russian/War.and.Peace.2016.s01e03.HDTV.1080p.LinkHudi.rus.srt',72877,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',9,'Subtitle/Russian/War.and.Peace.2016.s01e04.HDTV.1080p.LinkHudi.rus.srt',76745,'2023-12-10 11:57:33.939','2023-12-10 
11:57:33.939'),('<binary>',10,'Subtitle/Russian/War.and.Peace.2016.s01e05.HDTV.1080p.LinkHudi.rus.srt',63395,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',11,'Subtitle/Russian/War.and.Peace.2016.s01e06.HDTV.1080p.LinkHudi.rus.srt',62271,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',12,'War.and.Peace.2016.s01e01.HDTV.1080p.LinkHudi.mkv',3353454975,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',13,'War.and.Peace.2016.s01e02.HDTV.1080p.LinkHudi.mkv',2394696176,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',14,'War.and.Peace.2016.s01e03.HDTV.1080p.LinkHudi.mkv',2442758234,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',15,'War.and.Peace.2016.s01e04.HDTV.1080p.LinkHudi.mkv',2413460219,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',16,'War.and.Peace.2016.s01e05.HDTV.1080p.LinkHudi.mkv',2340709803,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',17,'War.and.Peace.2016.s01e06.HDTV.1080p.LinkHudi.mkv',5865345052,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',0,'javsubs91.txt',91,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',1,'702NOSKN-043.mp4',1587506156,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',0,'www.자료천국.com - 최초배포.txt',32,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',1,'위기의 X E06.END.2022.1080P.H264-FOA.mkv',1300844719,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',2,'위기의 X E05.2022.1080P.H264-FOA.mkv',1351757810,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',3,'위기의 X E04.2022.1080P.H264-FOA.mkv',1381413214,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',0,'Amer.2010.720p.BluRay.x264-7SiNS.mkv',4685012318,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',1,'Amer.2010.720p.BluRay.x264-7SiNS.nfo',1602,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',0,'Harem (Arthur Joffé, 1985).mkv',3611319527,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',1,'sample.mkv',31513565,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',2,'Harem (Arthur Joffé, 1985).english-hearing impaired.srt',61468,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',3,'Harem (Arthur Joffé, 1985).english.srt',38922,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',4,'Harem (Arthur Joffé, 1985).chinese.srt',34903,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',0,'SETUP/data.xp3',222597557,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',1,'SETUP/rijishimai.exe',3463168,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',2,'NemuAndHaruka.png',1338582,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',3,'Setup.exe',411648,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',4,'SETUP/plugin/krmovie.dll',225280,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',5,'SETUP/plugin/wuvorbis.tpm',200704,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',6,'SETUP/wuvorbis.dll',200704,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',7,'setup.jpg',131614,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',8,'SETUP/readme.txt',4064,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',9,'SETUP/メルマガ登録とユーザーアンケートのお願いについて.txt',579,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',10,'Setup.ini',577,'2023-12-10 11:57:33.939','2023-12-10 
11:57:33.939'),('<binary>',11,'SETUP/Mielホームページ.url',139,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',12,'SETUP/Mielアンケートページ.url',100,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',13,'H-Game uploaded by baka girlcelly - NemuAndHaruka.txt',60,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',14,'girlcelly@[Anime-sharing.com].txt',60,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',15,'Autorun.inf',53,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',0,'Archer.S08E01.No.Good.Deed.1080p.NF.WEBRip.DD5.1.x264-CtrlHD.mkv',622459630,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',1,'Archer.S08E02.Berenice.1080p.NF.WEBRip.DD5.1.x264-CtrlHD.mkv',565411283,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',2,'Archer.S08E03.Jane.Doe.1080p.NF.WEBRip.DD5.1.x264-CtrlHD.mkv',588360130,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',3,'Archer.S08E04.Ladyfingers.1080p.NF.WEBRip.DD5.1.x264-CtrlHD.mkv',648678105,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',4,'Archer.S08E05.Sleepers.Wake.1080p.NF.WEBRip.DD5.1.x264-CtrlHD.mkv',635843012,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',5,'Archer.S08E06.Waxing.Gibbous.1080p.NF.WEBRip.DD5.1.x264-CtrlHD.mkv',580366688,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',6,'Archer.S08E07.Gramercy.Halberd.1080p.NF.WEBRip.DD5.1.x264-CtrlHD.mkv',652962033,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',7,'Archer.S08E08.Auflosung.1080p.NF.WEBRip.DD5.1.x264-CtrlHD.mkv',616335761,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',8,'RARBG.txt',31,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',0,'Cruel.Intentions.1999.720p.Bluray.DTS.x264-RuDE (1).mkv',4676306394,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',1,'sample.mkv',54550130,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',2,'Cruel.Intentions.1999.720p.Bluray.DTS.x264-RuDE.nfo',4752,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',0,'HiHSP2201012 (1).htm',2053,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',1,'HiHSP2201012 (1).jpg',715664,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',2,'HiHSP2201012 (1).mht',357066,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',3,'HiHSP2201012 (1).mp4',1232258617,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',4,'HiHSP2201012 (1).png',5824,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',5,'HiHSP2201012 (1).url',188,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',6,'HiHSP2201012 (10).jpg',409670,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',7,'HiHSP2201012 (11).jpg',396074,'2023-12-10 11:57:33.939','2023-12-10 
11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',8,'HiHSP2201012 (12).jpg',455877,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',9,'HiHSP2201012 (13).jpg',397495,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',10,'HiHSP2201012 (14).jpg',423372,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',11,'HiHSP2201012 (15).jpg',158312,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',12,'HiHSP2201012 (16).jpg',448633,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',13,'HiHSP2201012 (17).jpg',175492,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',14,'HiHSP2201012 (18).jpg',431885,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',15,'HiHSP2201012 (19).jpg',441465,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',16,'HiHSP2201012 (2).jpg',390016,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',17,'HiHSP2201012 (2).mp4',824466785,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',18,'HiHSP2201012 (20).jpg',105577,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',19,'HiHSP2201012 (21).jpg',441118,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',20,'HiHSP2201012 (22).jpg',149655,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',21,'HiHSP2201012 (23).jpg',1836852,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',22,'HiHSP2201012 (24).jpg',1931150,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',23,'HiHSP2201012 (3).jpg',393012,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',24,'HiHSP2201012 (4).jpg',167453,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',25,'HiHSP2201012 (5).jpg',414268,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',26,'HiHSP2201012 (6).jpg',390495,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',27,'HiHSP2201012 (7).jpg',423440,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',28,'HiHSP2201012 
(8).jpg',139704,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',29,'HiHSP2201012 (9).jpg',431859,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('ef\ufffdh\ufffdm\ufffd\ufffd\ufffd0\\\ufffd\ufffd\ufffd\ufffd\ufffdWP\ufffd\ufffd',30,'_-_ HiHSP2201012  影檔名稱.txt',365,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',0,'1CA5X1RYK.jpg',236431,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',1,'_____padding_file_0_如果您看到此文件,请升级到BitComet(比特彗星)0.85或以上版本____',287857,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',2,'2.jpg',223235,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',3,'_____padding_file_1_如果您看到此文件,请升级到BitComet(比特彗星)0.85或以上版本____',301053,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',4,'8ecd0dd5e9a5bf54f703d52dc56d2fbbeb441625.jpg',85455,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',5,'_____padding_file_2_如果您看到此文件,请升级到BitComet(比特彗星)0.85或以上版本____',438833,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',6,'divxfactory-wisw13a.avi',732434432,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',7,'_____padding_file_3_如果您看到此文件,请升级到BitComet(比特彗星)0.85或以上版本____',520192,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',8,'divxfactory-wisw13b.avi',734070784,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',9,'_____padding_file_4_如果您看到此文件,请升级到BitComet(比特彗星)0.85或以上版本____',456704,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',10,'hlg2716@SexInSex! Board.url',250,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',11,'_____padding_file_5_如果您看到此文件,请升级到BitComet(比特彗星)0.85或以上版本____',524038,'2023-12-10 11:57:33.939','2023-12-10 11:57:33.939'),('<binary>',12,'[email protected]@22P2P.url',237,'2023-12-10 11:57:33.939','2023-12-1�
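
Two failure modes are relevant to the trace above: the multi-row INSERT into torrent_files can grow large enough to hit PostgreSQL's extended-protocol cap of 65,535 bind parameters per statement, and at least one text value in this particular batch (most likely a file path or a mis-encoded info hash) contains a raw 0x00 byte, which PostgreSQL rejects with SQLSTATE 22021. The sketch below is a minimal, hypothetical Go/GORM illustration of both mitigations — the TorrentFile struct and persistFiles helper are stand-ins I'm assuming for this example, not bitmagnet's actual generated model or persistence code.

package persistence

import (
	"strings"
	"time"

	"gorm.io/gorm"
)

// TorrentFile is a simplified stand-in for the torrent_files table
// (info_hash, index, path, size, created_at, updated_at).
type TorrentFile struct {
	InfoHash  []byte `gorm:"primaryKey"`
	Index     uint   `gorm:"primaryKey;column:index"`
	Path      string
	Size      int64
	CreatedAt time.Time
	UpdatedAt time.Time
}

// persistFiles strips null bytes from paths (PostgreSQL text values may not
// contain 0x00) and inserts in batches small enough to stay well under the
// 65,535 bind-parameter limit of the extended query protocol.
func persistFiles(db *gorm.DB, files []TorrentFile) error {
	for i := range files {
		// PostgreSQL rejects 0x00 inside text/varchar values (SQLSTATE 22021).
		files[i].Path = strings.ReplaceAll(files[i].Path, "\x00", "")
	}
	// 6 columns per row: 1,000 rows per statement is ~6,000 parameters,
	// comfortably below the 65,535 cap.
	return db.CreateInBatches(files, 1000).Error
}

Splitting the write with CreateInBatches bounds the parameter count of each statement regardless of how many files a crawled torrent contains, and sanitising text values before persisting avoids the UTF8 encoding error without dropping the affected rows.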
