
gopro's People

Contributors

danharvey, dustin, endgame, tymmej


gopro's Issues

SQLite database keeps a record of a medium even after deleting it via the official GoPro Cloud web interface

I found a file I didn't want to keep in the official GoPro Cloud web interface, so I deleted it there. Even after running 'gopro sync', I realised the local database still keeps a record of it.

I also tried 'gopro reprocess' (just in case, even though it isn't meant for deletions), and it didn't fix it.

First, is the item somehow marked as deleted? Or is this a bug? Or is the tool simply intended to always be used with its own web interface?

Second, do you think simply dropping the record from the media table would fix it?

P.S. Thanks for publishing your amazing tool!!
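
If dropping the row is indeed the answer, a minimal sketch of what that might look like follows, assuming the local cache has a media table keyed by media_id and lives in a SQLite file (table, column, and file names here are inferred from other messages in this tracker, not a confirmed schema):

{-# LANGUAGE OverloadedStrings #-}
-- Hypothetical cleanup: remove a stale row from the local cache database.
import Database.SQLite.Simple (Only (..), close, execute, open)

removeStaleMedium :: String -> IO ()
removeStaleMedium mediumId = do
  conn <- open "gopro.db"   -- assumed database file name
  execute conn "delete from media where media_id = ?" (Only mediumId)
  close conn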

Postgres dependency is missing

During the installation process, I discovered that the installation of Postgres was necessary. Although not a significant issue, I believe it would be beneficial to include this information in the readme.md file.

Specifically, the only requirement for installation is the libpq-dev package.

Internal Server Error using Docker image

Steps to reproduce:

$ docker run -it --entrypoint /bin/sh dustin/gopro:master
$ gopro auth
Enter email: [email protected]
Enter password:
# gopro sync -v
D: Reading auth token from DB
gopro: HttpExceptionRequest Request {
  host                 = "api.gopro.com"
  port                 = 443
  secure               = True
  requestHeaders       = [("Content-Type","application/json"),("Accept","application/vnd.gopro.jk.media+json; version=2.0.0"),("Authorization","<REDACTED>"),("Accept-Language","en-US,en;q=0.9"),("Origin","https://plus.gopro.com"),("Referer","https://plus.gopro.com/"),("User-Agent","github.com/dustin/gopro-plus 0.6.0.3")]
  path                 = "/media/search"
  queryString          = "?fields=captured_at,created_at,file_size,id,moments_count,ready_to_view,source_duration,type,token,width,height,camera_model&order_by=created_at&per_page=100&page=0"
  method               = "GET"
  proxy                = Nothing
  rawBody              = False
  redirectCount        = 10
  responseTimeout      = ResponseTimeoutDefault
  requestVersion       = HTTP/1.1
  proxySecureMode      = ProxySecureWithConnect
}
 (StatusCodeException (Response {responseStatus = Status {statusCode = 500, statusMessage = "Internal Server Error"}, responseVersion = HTTP/1.1, responseHeaders = [("Content-Type","application/json; charset=UTF-8"),("Content-Length","46"),("Connection","keep-alive"),("Date","Tue, 09 Apr 2024 21:06:59 GMT"),("Server","nginx"),("X-Request-Id","eac39c6e21170d95324ce85cb5d23f31"),("X-Runtime","0.021540"),("Vary","Accept-Encoding, Origin"),("Access-Control-Allow-Origin","https://plus.gopro.com"),("Access-Control-Allow-Credentials","true"),("Strict-Transport-Security","max-age=31536000; includeSubDomains"),("X-Cache","Error from cloudfront"),("Via","1.1 08c5e904e2f0226b2d9c1417f32b12f2.cloudfront.net (CloudFront)"),("X-Amz-Cf-Pop","ZRH50-C1"),("X-Amz-Cf-Id","u_3ctzxlejLhyrAsBAbk0FZpMYmGHtIHj1c9AbSwtgN7OO5pOQyEIg==")], responseBody = (), responseCookieJar = CJ {expose = []}, responseClose' = ResponseClose, responseOriginalRequest = Request {
  host                 = "api.gopro.com"
  port                 = 443
  secure               = True
  requestHeaders       = [("Content-Type","application/json"),("Accept","application/vnd.gopro.jk.media+json; version=2.0.0"),("Authorization","<REDACTED>"),("Accept-Language","en-US,en;q=0.9"),("Origin","https://plus.gopro.com"),("Referer","https://plus.gopro.com/"),("User-Agent","github.com/dustin/gopro-plus 0.6.0.3")]
  path                 = "/media/search"
  queryString          = "?fields=captured_at,created_at,file_size,id,moments_count,ready_to_view,source_duration,type,token,width,height,camera_model&order_by=created_at&per_page=100&page=0"
  method               = "GET"
  proxy                = Nothing
  rawBody              = False
  redirectCount        = 10
  responseTimeout      = ResponseTimeoutDefault
  requestVersion       = HTTP/1.1
  proxySecureMode      = ProxySecureWithConnect
}
}) "{\"status\":500,\"error\":\"Internal Server Error\"}")

Is it possible to serve content from a local backup in case the GoPro Cloud subscription runs out

First of all, thanks a lot for this epic tool.
This might have been answered already, but I was not able to find it: I want to use the tool locally only going forward.
My GoPro Plus subscription has run out, so I am downloading all my data and planning to back it up to local storage.
What would be the best way to tackle this? I did not know the best place to ask, so I created an issue. Please feel free to close it if it does not belong here.
Thanks again!!

Sync fails with Unauthorized 401 message

The app worked fine, but after a few hours the sync command started failing. I'm getting an Unauthorized 401 message.
I thought the token generated during authentication might have expired, so I tried the reauth command, then auth to start over, but neither fixed it. I'm wondering if they updated the API or revoked the API key you are using.
Have you faced this issue before? Could it be a DDoS protection mechanism?
It feels like CloudFront is blocking me, but I can't figure out why.

Unsupported Parser

I would really like an alternative to the unworkable GoPro app, and I'm more than comfortable with a CLI, but I don't know any Haskell, so I'm stumped by this error.

Having followed the install instructions, I authenticated and ran the first 'gopro sync'. It worked its way through more than 200 of the 244 items, and now I get:

gopro -v sync
D: Reading auth token from DB
I: 0 new items
I: Fetching meta 0
D: Need meta: []
I: Updating ("o6Ra0w75aMqnV",GPMF)
gopro: unsupported parser: 'S'
CallStack (from HasCallStack):
error, called at src/GoPro/GPMF.hs:129:20 in gpmf-0.1.0.3-350fHnVuGmDJDL1tyyYPbe:GoPro.GPMF

If you're able to shed any light on this so I can complete the sync, that would be appreciated.

Thanks for sharing these tools
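
For what it's worth, the message points at the per-type-character dispatch in the GPMF parser having no case for 'S'. In the published GPMF specification, 'S' denotes an unsigned 16-bit integer, so the missing case would amount to roughly the sketch below (illustration only, not the gpmf package's actual code):

import Data.Binary.Get (Get, getInt16be, getWord16be)
import Data.Int (Int16)
import Data.Word (Word16)

-- Hypothetical value type standing in for whatever the real parser produces.
data GPMFValue = GInt16 Int16 | GWord16 Word16

valueParser :: Char -> Get GPMFValue
valueParser 's' = GInt16  <$> getInt16be
valueParser 'S' = GWord16 <$> getWord16be   -- the case the error reports as unsupported
valueParser c   = error ("unsupported parser: " ++ show c)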

Web app with 'gopro serve' does not work

Hello! I started using your tool since it looks very useful, and I'm trying to make it work.

When I use 'gopro serve', the web server starts but going to 'localhost:8008' only brings up 'Something went wrong' and the console displays 'static/index.html: withBinaryFile: does not exist (No such file or directory)'.

Do you know if it is a setup issue or a bug?

Windows build failure

Hi, sorry for this newbie question, but I'm not very acquainted with the terminal and this kind of setup. Are there any simple steps I could take to solve these dependency errors?

[screenshot of the dependency errors]

Error after refreshing the web UI

I was unable to load the media: bad body: Problem with the value at json[3].height:

null

Expecting an INT

When syncing for the first time the web UI showed some content, but after a while a refresh gives this message (it had fetched a couple of hundred of the 3119 items).
Is there a corrupt item in the GoPro cloud? How can I find the one causing this problem, or otherwise fix it?
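
The failure is the usual strict-decoder problem: the media list decoder expects height to always be an integer, and one item has null. Whatever fix upstream chooses, the shape is to treat the field as optional. A minimal Aeson sketch under that assumption (the web UI's own decoder is Elm, and this record is illustrative, not the project's code):

{-# LANGUAGE OverloadedStrings #-}
-- Illustrative only: decode width/height as optional so a null from the cloud
-- API no longer fails the whole media list.
import Data.Aeson (FromJSON (..), withObject, (.:?))

data MediumDims = MediumDims
  { mdWidth  :: Maybe Int
  , mdHeight :: Maybe Int
  } deriving (Show)

instance FromJSON MediumDims where
  parseJSON = withObject "MediumDims" $ \o ->
    MediumDims <$> o .:? "width" <*> o .:? "height"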

gopro serve - web-server blocks when starting up

Hello and first things first: thanks for the huge amount of work. This tool could be very time-saving for a lot of people!

I'm working on a Mac M1 with Big Sur 11.0.1.

I've installed ffmpeg and followed the instructions provided here. All seemed to work fine.
I've also added the gopro user directory to my PATH, as suggested.

Unfortunately, when I launch the web UI with gopro serve, the terminal just hangs with no further output (even with --verbose).

I: Starting web server

Syncing seems to work fine (even if not all files are read, maybe the ReelSteady GO ones).
I would like to use the web GUI to batch-download date-filtered files.

How could I fix it? Am I missing something?

Thank you so much

Thanks, Dustin, for the amazing work! I can't understand why I'm only the 10th star...

Integrate with GoProX

Hi,

Would love to integrate with GoProX workflow.

Primarily to upload processed media to GoPro Plus. What would be the most lightweight way to make this happen (CLI only)?

Very much appreciate what you are doing!

GoPro Max files are not supported

Hi,

I want to sync my GoPro Max files with this program, but I get the following error.

I: Ignoring some unknown files: ["GS010128.1.360"]
I: Have 0 media items to upload in 0 parts with a total of 0 chunks (0 MB)
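
The .360 files are presumably being skipped by a filename or extension check before upload. A hypothetical classifier that also accepted Max files might look like the sketch below (not the tool's actual filter, just an illustration of the change being requested):

import Data.Char (toLower)
import System.FilePath (takeExtension)

-- Hypothetical upload filter extended with GoPro Max ".360" files;
-- the accepted-extension list is an assumption.
isUploadable :: FilePath -> Bool
isUploadable p = map toLower (takeExtension p) `elem` [".mp4", ".jpg", ".jpeg", ".360"]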

Stack install error while building package postgresql-libpq

On commit 330b897 I get a stack install error:

Fedora 38 (6.5.8-200.fc38.x86_64)

postgresql-libpq             > configure
postgresql-libpq             > [1 of 3] Compiling Main             ( /tmp/stack-b96ac213d83ea9ad/postgresql-libpq-0.9.5.0/Setup.hs, /tmp/stack-b96ac213d83ea9ad/postgresql-libpq-0.9.5.0/.stack-work/dist/x86_64-linux-tinfo6/Cabal-3.8.1.0/setup/Main.o )
postgresql-libpq             > [2 of 3] Compiling StackSetupShim   ( /home/relentless/.stack/setup-exe-src/setup-shim-Z6RU0evB.hs, /tmp/stack-b96ac213d83ea9ad/postgresql-libpq-0.9.5.0/.stack-work/dist/x86_64-linux-tinfo6/Cabal-3.8.1.0/setup/StackSetupShim.o )
postgresql-libpq             > [3 of 3] Linking /tmp/stack-b96ac213d83ea9ad/postgresql-libpq-0.9.5.0/.stack-work/dist/x86_64-linux-tinfo6/Cabal-3.8.1.0/setup/setup
postgresql-libpq             > Configuring postgresql-libpq-0.9.5.0...
postgresql-libpq             > Error: setup: The program 'pg_config' is required but it could not be found.
postgresql-libpq             >
crypton                          > copy/register
crypton                          > Installing library in /home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/lib/x86_64-linux-ghc-9.4.6/crypton-0.33-E0XHNcGjfq55myX2eFTwG6
crypton                          > Registering library for crypton-0.33..
cryptonite                       > copy/register
cryptonite                       > Installing library in /home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/lib/x86_64-linux-ghc-9.4.6/cryptonite-0.30-K750s6VpRDbLPZF227pMDg
cryptonite                       > Registering library for cryptonite-0.30..
Progress 93/252            

--  While building package postgresql-libpq-0.9.5.0 (scroll up to its section to see the error) using:
      /tmp/stack-9c92f5e8c1152d4e/postgresql-libpq-0.9.5.0/.stack-work/dist/x86_64-linux-tinfo6/Cabal-3.8.1.0/setup/setup --verbose=1 --builddir=.stack-work/dist/x86_64-linux-tinfo6/Cabal-3.8.1.0 configure --with-ghc=/home/relentless/.stack/programs/x86_64-linux/ghc-tinfo6-9.4.6/bin/ghc-9.4.6 --with-ghc-pkg=/home/relentless/.stack/programs/x86_64-linux/ghc-tinfo6-9.4.6/bin/ghc-pkg-9.4.6 --user --package-db=clear --package-db=global --package-db=/home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/pkgdb --libdir=/home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/lib --bindir=/home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/bin --datadir=/home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/share --libexecdir=/home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/libexec --sysconfdir=/home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/etc --docdir=/home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/doc/postgresql-libpq-0.9.5.0 --htmldir=/home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/doc/postgresql-libpq-0.9.5.0 --haddockdir=/home/relentless/.stack/snapshots/x86_64-linux-tinfo6/4de6784bf5c40578d07337abf654ed2c3f592b74f2aab218a101c8a52d7bf209/9.4.6/doc/postgresql-libpq-0.9.5.0 --dependency=Cabal=Cabal-3.8.1.0 --dependency=base=base-4.17.2.0 --dependency=bytestring=bytestring-0.11.5.1 --dependency=unix=unix-2.7.3 -f-use-pkg-config --exact-configuration --ghc-option=-fhide-source-paths
    Process exited with code: ExitFailure 1

Error 422 during upload

Hello,

I have a problem. During my upload process, I get the following error:

StatusCodeException (Response {responseStatus = Status {statusCode = 422, statusMessage = "Unprocessable Entity"}

and directly under that, there is another one:

"{"_errors":[{"code":6105,"description":"cache value is nil for this request: {DerivativeID:2381467954777687236 UploadID:n3sa.1SZig0dRoUoyUFNhsNgD1UTCqrywDS3IvDAWzH4ZwWunImUIcvdY7leQa0XYWvJjyiUIjqmv4yLzqoOoXSD.gui7WsTPmr9xc1AKTTe7mXDe7HjUSdTbwydRVqy ItemNumber:1 CameraPosition:default TranscodeSource: FileSize:2732485278 PartSize:6291456 Page:1 PerPage:435 _:0}"}

Am I missing something?

Consider the possibility of downloading only originals

Hi,

I'm using your fabulous program to create a local backup of my GoPro Plus data on a Synology NAS. Everything works perfectly so far. But as the program downloads every version of the GoPro Plus content, the downloaded data is quite large.

I'd prefer to download only the original files, ignoring the proxy files and thumbnails. Is this feasible?

Thanks for your work!
Michael
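
A sketch of the filtering such an option might do, keeping just the variant labelled as the original upload (the record fields and label strings here are assumptions, not the tool's actual data model):

-- Hypothetical media variant record.
data Variant = Variant
  { varLabel :: String   -- e.g. "source", "proxy", "thumbnail" (assumed labels)
  , varURL   :: String
  } deriving (Show)

originalsOnly :: [Variant] -> [Variant]
originalsOnly = filter ((== "source") . varLabel)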

Sync Issue

Received the following error, which halted the sync.

gopro: SQLite3 returned ErrorError while attempting to perform prepare "insert into files (media_id, section, label, type, item_number, file_size) values (?, ?, ?, ?, ?)": 5 values for 6 columns.

I was able to get it running by simply adding the missing placeholder (?) to the statement.
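
For reference, the corrected statement has six placeholders for the six listed columns. Shown here with sqlite-simple; the surrounding helper is a sketch, not the project's actual insert site:

{-# LANGUAGE OverloadedStrings #-}
import Database.SQLite.Simple (Connection, execute)

-- Insert one files row: six columns, six placeholders.
insertFileRow :: Connection -> (String, String, String, String, Int, Int) -> IO ()
insertFileRow conn row =
  execute conn
    "insert into files (media_id, section, label, type, item_number, file_size) values (?, ?, ?, ?, ?, ?)"
    row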

Troubleshooting Request - metadata syncing to bucket but not media

First, thank you for this. I was looking for a solution to back up GoPro videos from their portal directly to S3.

I believe I've set up the Lambda function, SQS queue, and bucket as required, with appropriate permissions. When running a gopro backup, I appear to be pushing all the metadata to the S3 bucket, but no videos.

The tool is looping with this status:
I: Waiting for 35 files to finish copying
I: Processing 0 responses
I: Waiting for 35 files to finish copying
I: Processing 0 responses
I: Waiting for 35 files to finish copying
I: Processing 0 responses
I: Waiting for 35 files to finish copying

My function logging indicates the download request is being made:
[screenshot of Lambda function logs]

The SQS queue is only showing empty receives.
[screenshot of SQS queue metrics]

Wondering if this info is enough to get a quick pointer on how to dig further. Any assistance appreciated - thank you for the tool!

Could not find gpmf-0.1.1.1 on Hackage

Following the installation instructions today I got the following error on stack install

_@_:~/gopro$ stack install
Cabal file info not found for gpmf-0.1.1.1, updating
Selected mirror https://hackage.haskell.org/
Downloading timestamp
Waiting to acquire cache lock on /home/_/.stack/pantry/hackage/hackage-security-lock
Acquired cache lock on /home/_/.stack/pantry/hackage/hackage-security-lock
Released cache lock on /home/_/.stack/pantry/hackage/hackage-security-lock
No package index update available and cache up to date
Package index cache populated
Could not find gpmf-0.1.1.1 on Hackage
Perhaps you meant hmp3, html, time, stm, pqc, gd, exif, Hmpf, hpc, or Diff?

I installed stack from scratch, after also having failed in the same way with the distribution-packaged stack.

Be able to pull only the metadata

Hi Dustin,
I would like to use your library to develop a desktop app like Google Drive, but for the GoPro cloud, because I have far too many videos to be able to download them all.

What I mean by that is that I would like to be able to sync my videos without needing to download them.
I want to see all the videos from the GoPro cloud on my local machine based on the metadata only, and be able to download an existing video or upload new ones.

But unfortunately your tool does not let me download only the metadata.
Is there a way I could achieve my goal using your tool?

Thank you!

GoPro Upload Issue: Excessive Requests, Directory Restructuring, and Non-existent File Errors

Description:

I have been using an application to upload a large number of files from various sources to the GoPro cloud. This application has been a godsend, helping me consolidate and upload files that were scattered across multiple clouds and drives, some of which I was unsure if I had already uploaded to the GoPro cloud. However, I have encountered several issues and have had to create scripts in Haskell to support my use case and extend the functionality of the application.

Background:

I have been trying to rectify a lack of system organization dating back to 2017. I have a large number of files and exports scattered across different clouds and drives, and I am trying to ensure that all these files are backed up to the GoPro cloud for peace of mind in case of HDD failure.

Enhancements Made:

  1. Developed a Haskell script to automate the GoPro upload process from a shell script on Mac. The script notifies the user upon completion and triggers an alert that requires user interaction to finish, ensuring a smooth workflow.
  2. Created a validation script to ensure upload paths do not contain spaces, which can be problematic when sourcing from Google Drive due to the unremovable space in "My Drive".
  3. Wrote a script to flatten a directory's nested folders, addressing previously inexplicable organization methods.
  4. Developed a script to partition a directory into subdirectories of 30 files each (see the sketch after this list), as GoPro seems to prefer this number and it matches the maximum number of notifications shown on the status upload page when a duplicate is encountered.
  5. Extended GoPro upload functionality with a script that accepts nested directories and executes uploads on each directory sequentially.
  6. Implemented a sync script to log out to a specified directory.
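
A minimal sketch of the partitioning step in item 4, assuming a flat source directory: move files into numbered subdirectories of 30 each. The chunk size follows the description above; the helper names are hypothetical.

import Control.Monad (forM_, zipWithM_)
import System.Directory (createDirectoryIfMissing, listDirectory, renameFile)
import System.FilePath ((</>))

-- Split a list into chunks of at most n elements.
chunksOf :: Int -> [a] -> [[a]]
chunksOf _ [] = []
chunksOf n xs = take n xs : chunksOf n (drop n xs)

-- Move every file in dir into dir/batch-1, dir/batch-2, ... of 30 files each.
partitionDir :: FilePath -> IO ()
partitionDir dir = do
  files <- listDirectory dir
  zipWithM_ moveChunk [1 :: Int ..] (chunksOf 30 files)
  where
    moveChunk i names = do
      let sub = dir </> ("batch-" ++ show i)
      createDirectoryIfMissing True sub
      forM_ names $ \n -> renameFile (dir </> n) (sub </> n)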

Current Problem:

After restructuring directories to flatten all files and then partition them back sequentially by filename into chunks of 30, I encountered an issue. The GoPro upload command is failing with an error stating that the file it's trying to finish does not exist. This is presumably because the file was moved or deleted during the partitioning process. This issue arose when I attempted to upload around 250 files out of 4TB of footage.

Potential Solution:

I am considering running the GoPro cleanup command to stop expecting uploads from the desktop. However, I am concerned that this might also stop and close files that have finished the uploading phase and are still in processing. A safer alternative might be to inform the server to stop waiting for an upload that is no longer coming, while ensuring that the server continues to process and finish any files that are currently in a processing state. I am looking for confirmation or a workaround for this issue.

Individual/subset media download command

Introduce a command similar to backuplocal, but allow specifying the media to be downloaded.

Specification for downloads may be medium IDs or perhaps some kind of query (e.g., time relative or maybe even a small expression language).

gopro sync fails

The command gopro sync fails with the error below.
At the same time, gopro upload works fine.

I: 6028 new items
I: Storing batch of 100
gopro: HttpExceptionRequest Request {
  host                 = "images-02.gopro.com"
  port                 = 443
  secure               = True
  requestHeaders       = [("Content-Type","application/json"),("Accept","application/vnd.gopro.jk.media+json; version=2.0.0"),("Authorization","<REDACTED>"),("User-Agent","github.com/dustin/gopro 0.1")]
  path                 = "/resize/450wwp/qweyJhbGciOiJIUzI1NiJ9.ehgggyJtZWRpdW1fasssWQiOiIxOTU1Mzc1MTc2ODIwNDU4NTM2Iiwib3duZXIiOiI1NDVlYzZiYS03YWJlLTQ0YTctOTZkNy1iMTFlYTQxN2Q0NTUiLCJpc19wdWJsaWMiOmZhbHNlLCJvIjoxLCJ0cmFucyI6bnVsbCwicmVnaW9uIjoidXMtd2VzdC0yIiwidGh1bWJuYWlsX3VwZGF0ZWRfZGF0ZSI6bnVsbH0.3k-ro9BJl5OK-Z1duI4xLT6qCMd6KbUWuwuBAMEDRm4"
  queryString          = ""
  method               = "GET"
  proxy                = Nothing
  rawBody              = False
  redirectCount        = 10
  responseTimeout      = ResponseTimeoutDefault
  requestVersion       = HTTP/1.1
}
 (StatusCodeException (Response {responseStatus = Status {statusCode = 404, statusMessage = "Not Found"}, responseVersion = HTTP/1.1, responseHeaders = [("Content-Type","image/jpeg"),("Content-Length","272"),("Connection","keep-alive"),("Date","Wed, 22 Sep 2021 08:02:21 GMT"),("Server","nginx"),("Vary","Accept-Encoding, Origin"),("Access-Control-Allow-Credentials","true"),("Strict-Transport-Security","max-age=31536000; includeSubDomains"),("X-Cache","Error from cloudfront"),("Via","1.1 2f194b62c8c43859cbf5af8e53a8d2a7.cloudfront.net (CloudFront)"),("X-Amz-Cf-Pop","FRA2-C2"),("X-Amz-Cf-Id","Um2L9ajXeqJdT88GlVNQnGzyQQ7TfOdSpwqJT61ZHQKX3uKB-xuO0g==")], responseBody = (), responseCookieJar = CJ {expose = []}, responseClose' = ResponseClose}) "\255\216\255\224\NUL\DLEJFIF\NUL\SOH\SOH\NUL\NULH\NULH\NUL\NUL\255\219\NUL\132\NUL\SOH\SOH\SOH\SOH\SOH\SOH\STX\SOH\SOH\STX\ETX\STX\STX\STX\ETX\EOT\ETX\ETX\ETX\ETX\EOT\ACK\EOT\EOT\EOT\EOT\EOT\ACK\a\ACK\ACK\ACK\ACK\ACK\ACK\a\a\a\a\a\a\a\a\b\b\b\b\b\b\t\t\t\t\t\v\v\v\v\v\v\v\v\v\v\SOH\STX\STX\STX\ETX\ETX\ETX\ENQ\ETX\ETX\ENQ\v\b\ACK\b\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\v\255\192\NUL\DC1\b\NUL\n\NUL\n\ETX\SOH\"\NUL\STX\DC1\SOH\ETX\DC1\SOH\255\196\NULL\NUL\SOH\SOH\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\a\DLE\SOH\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\SOH\SOH\SOH\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\b\t\DC1\SOH\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\NUL\255\218\NUL\f\ETX\SOH\NUL\STX\DC1\ETX\DC1\NUL?\NUL\155\128v%[\255\217")
root@005ceca2fddf:/mnt/Scans# gopro sync
I: 6028 new items
I: Storing batch of 100

gopro: Prelude.read: no parse

First of all, thank you for creating this tool!
It saved me from having to manually download each video from the cloud, now that I'm about to cancel the subscription.

Just wanted to flag an issue: the gopro sync and gopro fetchall commands were failing with gopro: Prelude.read: no parse

I eventually worked out the list of all media and ran the refresh command on each of them; after that, I was able to download them.

A specific issue example:

gopro refresh xxxxx
I: Processing batch of 1
gopro: Prelude.read: no parse

Not sure exactly why it's failing, but my library of 65 items had 2 errors.

Perhaps if you added a printout of the API response to the verbose output, it would be easier to figure out, as adding -v currently doesn't do much.

I've got a month left on my subscription so happy to help you figure out why the error happens before all my content is deleted from there.

Regards
Matt
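
For context on the diagnostics complaint: Prelude.read aborts with the context-free "no parse", whereas something like readMaybe lets the caller attach the offending input to the error, which is what more useful -v output would need. An illustrative sketch (not the project's code):

import Text.Read (readMaybe)

-- Parse an Int but report the raw input on failure instead of crashing.
parseIntField :: String -> Either String Int
parseIntField s =
  maybe (Left ("could not parse Int from: " ++ show s)) Right (readMaybe s)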

Installing on Mac M1

During stack install I get the same issue as in https://gitlab.haskell.org/ghc/ghc/-/issues/20592

I see that it is fixed in a newer GHC version, so I pass --resolver ghc-9.2.4 (I have also tried nightly).

Then I get multiple errors about dependencies:

Error: While constructing the build plan, the following exceptions were encountered:

In the dependencies for aeson-2.0.3.0:
[...]
In the dependencies for x509-validation-1.6.12:
    asn1-encoding must match >=0.9 && <0.10, but the stack configuration has no specified version  (latest matching
                  version is 0.9.6)
    asn1-types must match >=0.3 && <0.4, but the stack configuration has no specified version  (latest matching version
               is 0.3.4)
    x509 must match >=1.7.5, but the stack configuration has no specified version  (latest matching version is 1.7.7)
needed due to gopro-0.1.0.0 -> x509-validation-1.6.12

Some different approaches to resolving this:

  * Recommended action: try adding the following to your extra-deps in /Users/xxx/code/gopro/stack.yaml:

- RSA-2.4.1@sha256:b52a764965cd10756646cc39eadcbc566e131181a75f2a13c621697f4b06d76b,2467
- ...

So I do as recommended.

Finally, I get the following error:

gpmf > Building executable 'gpmf' for gpmf-0.1.1.1..
gpmf > [1 of 2] Compiling Main
gpmf >
gpmf > /private/var/folders/k7/s7p_419d4x91m_mbj6ggxtqw0000gn/T/stack-b3826990cd58aa9d/gpmf-0.1.1.1/app/Main.hs:14:24: error:
gpmf >     Not in scope: ‘BL.putStrLn’
gpmf >     Perhaps you meant one of these:
gpmf >       ‘BL.putStr’ (imported from Data.ByteString.Lazy),
gpmf >       ‘BS.putStr’ (imported from Data.ByteString)
gpmf >     Module ‘Data.ByteString.Lazy’ does not export ‘putStrLn’.
gpmf >    |
gpmf > 14 |   either print (mapM_ (BL.putStrLn . maybe "" showDEVC . uncurry mkDEVC)) . parseGPMF =<< BS.readFile fn
gpmf >    |                        ^^^^^^^^^^^
Progress 1/2

Is it possible to use this tool on macOS with an M1 processor?
I installed it on x86_64 Linux without problems, but I prefer macOS.
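
One possible fix for the compile error above (a sketch, not necessarily how upstream resolved it): putStrLn for lazy ByteStrings lives in the Char8 module, so importing that qualified as BL makes the original call site compile unchanged.

-- Original call site, for reference:
--   either print (mapM_ (BL.putStrLn . maybe "" showDEVC . uncurry mkDEVC)) . parseGPMF =<< BS.readFile fn
import qualified Data.ByteString.Lazy.Char8 as BL

main :: IO ()
main = BL.putStrLn (BL.pack "Data.ByteString.Lazy.Char8 exports putStrLn")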

JSONError Unexpected MediumType Audio

While running gopro sync I received the following error.
JSONError "Error in $.type: Unexpected MediumType: \"Audio\""

I found that issue #14 was another Unexpected MediumType that was easily resolved.

I see that the error is thrown in the dependency package here: https://github.com/dustin/gopro-plus/blob/master/src/GoPro/Plus/Media.hs#L115 although I don't believe a distinct Audio MediumType should fall into the photo "bucket".

I recently made a GoPro edit on mobile using a local sound file, so I suspect that is what's suddenly being seen.
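
The failure is the usual closed-enum decode problem: the MediumType parser has no case for the newly observed "Audio" value. An illustrative Aeson sketch of that shape, with a tolerant catch-all (constructor names here are assumptions, not gopro-plus's actual type):

{-# LANGUAGE OverloadedStrings #-}
import Data.Aeson (FromJSON (..), withText)

data MediumType = Photo | Video | Audio | OtherMedium
  deriving (Show)

instance FromJSON MediumType where
  parseJSON = withText "MediumType" $ \t -> pure $ case t of
    "Photo" -> Photo
    "Video" -> Video
    "Audio" -> Audio          -- the value the error complains about
    _       -> OtherMedium    -- catch-all avoids failing on future types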

Request: docker container

Thank you for the great contribution.

However, I really need this to be in a docker container to make use of it in my environment.

Could I please ask you to put together an image? It would increase adoption, no doubt.

Reuploading restarts the whole process

I am using the software through Docker. As I have recently subscribed to Plus, I am trying to upload my existing library that I have stored locally. When I upload the files, it works fine until there is an error (typically a timeout). If I then run the upload again, it restarts the whole process: it creates new upload IDs and starts all over again, including the files that were already successfully uploaded, instead of continuing the remaining unfinished uploads. I assume this may be due to running through Docker.

Using the following command:
docker run --interactive --volume $PWD:/usr/lib/gopro --volume "path/to/the/media:/data" --rm dustin/gopro:master gopro upload /data -v

Could there be other files that should be kept persistent for it to retain the current upload status?

Unexpected MediumType: "MultiClipEdit" while performing first sync

I've installed it on a Raspberry Pi, and I'm trying to run it for the first time.

I've performed the auth steps and now I'm trying to run the sync command for the first time and it's failing with the following error:

$ gopro sync -v
D: Reading auth token from DB
gopro: JSONError "Error in $.type: Unexpected MediumType: \"MultiClipEdit\""

The same error happens when trying to run fetchall.

It's a type of media that was created using the GoPro app and was available in the GoPro Media Library when filtering by Edits. I've already deleted it from there, but it still fails.

Please let me know if there are additional ways to debug the issue.

failed to parse field 'extra-deps' when installing

I get this:
jaw@wormnethub:~/gopro$ sudo stack install
Could not parse '/home/jaw/gopro/stack.yaml':
Aeson exception:
Error in $['extra-deps'][1]: failed to parse field 'extra-deps': (Invalid package identifier: "crypton-0.33@sha256:5e92f29b9b7104d91fcdda1dec9400c9ad1f1791c231cc41ceebd783fb517dee,18202","crypton-0.33@sha256:5e92f29b9b7104d91fcdda1dec9400c9ad1f1791c231cc41ceebd783fb517dee,18202")
See http://docs.haskellstack.org/en/stable/yaml_configuration/
What should I do? I'm on Ubuntu 18.04 LTS.

Clearing error trying to upload after using createupload command

First of all, thanks for the effort to build this tool. I have a back catalogue of originals I want to upload to the GoPro cloud, so I've been trying to get my head around all the commands on my Mac.

I tried using the createupload command to queue up some uploads, but now when I run upload there is an error.

I: Have 71 media items to upload in 71 parts with a total of 3180 chunks (19080 MB)
I: Uploading "KRdpO2eB7E3Zp" in 1 chunks (6 MB)
gopro: <file with spaces>.jpg: getFileStatus: does not exist (No such file or directory)

Even if I use the upload command and supply a different path of files, it still fails with the same error.

Is there a way to clear the queue and try createupload again?

Lastly, is there a way to use either upload or createupload and choose to include subdirectories (recursively)? My footage is in folders like:

/Volumes/WD/2019/
 -> 2019-09-21/HERO4 Silver 1
 -> 2019-09-22/HERO4 Silver 1
 -> 2019-09-23/HERO4 Silver 1
 -> 2019-09-24/HERO4 Silver 1

It seems as though I have to enter the path names of the actual GoPro MP4 files.

I was hoping for something like:

gopro createupload /Volumes/WD/2019 -R
gopro upload
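
There's no recursive flag mentioned here, but one workaround is to collect the file paths yourself and pass them to the command explicitly. A sketch of that collection step, assuming only the MP4 originals are wanted (helper names are hypothetical):

import Control.Monad (filterM, forM)
import Data.Char (toLower)
import System.Directory (doesDirectoryExist, listDirectory)
import System.FilePath (takeExtension, (</>))

-- Walk a directory tree and return every file path found.
listFilesRecursive :: FilePath -> IO [FilePath]
listFilesRecursive dir = do
  entries <- map (dir </>) <$> listDirectory dir
  dirs    <- filterM doesDirectoryExist entries
  let files = filter (`notElem` dirs) entries
  rest    <- concat <$> forM dirs listFilesRecursive
  pure (files ++ rest)

-- Keep only the MP4s, e.g. to hand to 'gopro upload' as explicit arguments.
goproOriginals :: FilePath -> IO [FilePath]
goproOriginals root =
  filter ((== ".mp4") . map toLower . takeExtension) <$> listFilesRecursive root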
