
Comments (28)

JamesH65 commented on September 10, 2024

This is on my list of stuff to look at. If I remember the code correctly (and it's a nightmarish and very large bit of code to handle all the exposure modes) this won't be a simple thing to do.

from userland.

dmopalmer commented on September 10, 2024

If there were some command to replace the 1/100 s that is used for --exposure=off, then that would probably be enough.

(But you know the code and I can see how that might screw up the timing for other modes, e.g. if that parameter is also the initial guess that is used in autoexposure.)

JamesH65 commented on September 10, 2024

Turned out not to be so bad. I've made exposure a command line parameter; it's currently going through merging, but because it needs changes on both the GPU and ARM side, it will take a few days to get through the system.

There is an issue with exposures > about 1/3 of a second - they lock up the GPU. This was not introduced by this change, but has always been there. I'm investigating, but so far it looks like a driver or perhaps even a sensor problem. I've contacted Omnivision to see if they can help, because everything in the register settings looks right to me!

dmopalmer commented on September 10, 2024

As an astronomer, I am quite interested in exposures > 1/3 second, so I thank you for pushing this through.
Is the exposure time given in e.g. row times, and the problem occurs at a 2^n boundary, or is it something more random?

(There are limits due to dark current, but modern CMOS imagers should be able to handle seconds of exposure even at room+ temperatures, and I am willing to strap an ice-cube to the sensor to get longer exposures.)

JamesH65 commented on September 10, 2024

I've checked the rows and suchlike - there seems to be no obvious reason why it fails at the particular value it does - no boundary or similar - hence requesting help from Omnivision. Unfortunately the FAE I work with is a bit busy at the moment.

But it fails at a specific point, so they should be able to either replicate it or not fairly quickly.


HeatfanJohn commented on September 10, 2024

@dmopalmer, just curious, how do you interface the Pi's camera to your telescope? Do you have any blog postings that detail how you are doing that?

dmopalmer commented on September 10, 2024

@HeatfanJohn, I will be unscrewing the lens from the camera module and 3D printing a mount with a tube that goes into the eyepiece tube. The pixel size on the camera module is a good enough fit for the prime focus of my 5" Celestron. Eyepiece projection would allow more flexibility, but I may as well start easy.

I will eventually also print up something that holds the Pi and a battery to give a single unit that works as a web-enabled eyepiece (using my piCamServer ) so I can view it from an iPad without any cables.

dfrey commented on September 10, 2024

Hi David,
I am really interested in your plan for prime focus. Are you planning to share your plans for the mount?

Thanks

Davide


dmopalmer commented on September 10, 2024

When I start printing the mount, I will put it on Thingiverse and/or elsewhere (most likely on GitHub along with any astronomical imaging software I develop for it. Lucky imaging is one of the things I want to try with it.)

I use OpenSCAD for 3D design, so I will make the draw tube diameter a parameter (0.965" on mine, but 1.25" and 2" are common, and parameterization means it doesn't have to be a common size). If I get fancy with internal baffles I can set the minimum f-number as an additional parameter. Do you have a specific optical configuration I should try to make it compatible with?

(Some of this is that when you have a hammer, everything looks like a nail. If I didn't have a 3D printer I'd put something together with tubing from the hardware store and epoxy in an hour. If I had a machine shop and the appropriate skills I'd use a lathe and be done in an afternoon, unless it had a CNC mill in which case I could spend weeks getting it right.)

But I'm not ready to take the lens off the camera module yet because it's useful for debugging to have simple optics so I can just stand in front of the camera and wave my arms to test things out.

dmopalmer commented on September 10, 2024

@HeatfanJohn @dfrey
I have just released the design for the telescope interface to GitHub (its official home) and Thingiverse (which I consider a convenience copy, with a few more pictures).
https://github.com/dmopalmer/PiPiece
http://www.thingiverse.com/thing:181310

The field of view is quite small, as you would expect from the size of the imager chip. The Thingiverse picture of the raven head filling the frame was shot from ~40 meters away on a 5" telescope. (The Moon was shot from 10 million times further away and it was still much bigger than the FOV. From this we reach the scientific conclusion that Raven could not steal the Moon--unless he was very tricky.)

Whenever the exposure changes get folded into the closed-source component of the Pi, I can start doing astronomy.

maxhal commented on September 10, 2024

Hi James, any news/feedback about this problem from Omnivision?

JamesH65 commented on September 10, 2024

I have info from Omnivision but have not had the time to investigate the issue. It should work...

coolcrab commented on September 10, 2024

I modded my Pi for astronomy too, so it would be great to get long exposures integrated!
So much potential and pretty pictures with this thing.

Any news yet? :P

JamesH65 commented on September 10, 2024

Got it up to about 2 s, but failed to get it working with anything longer. Because of the way things work, over 2 s needs the driver to work in a different way, and I'm not sure the layers above can cope with that. No more time at the moment to spend on this.

coolcrab commented on September 10, 2024

Could it be possible to open-source the whole thing? Then others could try :P
There is no real rush, but I'd love to see this happen.

JamesH65 commented on September 10, 2024

No, I'm afraid all this code is on the GPU, which is closed source. It's also horribly complicated.


astrorafael commented on September 10, 2024

Perhaps it would be possible if some frame averaging could be done in GPU memory in steps of 2 secs.

jdunmire commented on September 10, 2024

Did the 2-second support make it into the source? The docs still say 330000 uSeconds, but I didn't see a range check in the code.

digitalspaceport commented on September 10, 2024

I don't think the 2 s exposure change made it into the release, as he indicated above that it needs a change to the driver as well. I would love to fork that base for us astronerds! 2 seconds is a pretty fair improvement over 1 second when it comes to planetary work and keeping ISOs trimmed.

Ruffio commented on September 10, 2024

@dmopalmer @jollyjollyjolly is this still an issue?

dmopalmer commented on September 10, 2024

I have not looked at the source code recently, but as of raspistill v.1.38
$ raspistill -o /tmp/foo.png --exposure verylong
gives a 1/15 s autoexposure, even when the camera is in a light-tight box

Reading #151 I see that the correct option is '--shutter'. When I use --shutter 5000000 I get an image with a blown-out exposure, and the EXIF data indicates a 5 s exposure time, so it seems to be working.

It takes 36 s to actually get a picture this way; presumably the system is trying to find a way to make the exposure look good. I haven't read the docs recently, so I don't know if there is a way to tell raspistill to 'set the camera up like so, then just take the picture.'

But anyway, it does not seem to be an issue.
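A minimal sketch of the invocation described above. The 5 s value and output path follow the comment; the `command -v` guard and the use of `--exposure off` (to stop the firmware metering the scene) are assumptions, not something the comment confirms avoids the 36 s delay:

```shell
SECS=5
SHUTTER_US=$((SECS * 1000000))   # --shutter takes microseconds
echo "$SHUTTER_US"               # prints 5000000

# Guarded so this is a no-op on machines without the Pi camera stack.
if command -v raspistill >/dev/null 2>&1; then
  raspistill -o /tmp/foo.png --exposure off --shutter "$SHUTTER_US"
fi
```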

tejonbiker commented on September 10, 2024

Hi, I noticed how long a capture takes with -ss set to 6000000 (the max value), but I don't know when the camera starts recording. As far as I can see, the camera waits 6 seconds to capture something, then takes another 6-second capture (this image is shown in the preview on an HDMI monitor), then waits another 6 seconds and starts recording the final image (another 6 seconds). Is there any way to know when the camera starts recording the final image?

JamesH65 commented on September 10, 2024

There are some threads on the forum which cover exactly this topic of using long exposure times.


Ruffio commented on September 10, 2024

@dmopalmer @jollyjollyjolly is this still an issue?

bbozon commented on September 10, 2024

I'm still having the issue that the camera takes ~40 seconds for a 6-second exposure. Does anybody know how to fix this?

@JamesH65 , do you know the status?

Thanks in advance!!!

JamesH65 commented on September 10, 2024

@6by9 Can't remember what we suggested as a solution for this.

6by9 commented on September 10, 2024

It's almost certainly just mode switching, so pretty much inherent. 40 seconds sounds high, though - I thought it was around 24.

Two constraints:

  • whenever the sensor starts streaming it produces one corrupted frame.
  • whenever the GPU has requested a frame, it completes it.

On starting raspistill, regardless of the -t setting, it starts the preview and requests a frame. Both the corrupt frame and the preview frame will require the exposure time of 6 seconds. T=12 secs now.

Assuming the capture has been requested by 12 secs, the sensor then gets stopped, reprogrammed for the capture mode, and restarted. Starting the sensor means dropping a frame, so another 12 secs.

If burst mode hasn't been requested, then as soon as the capture is completed it will mode-switch back to preview and request a preview frame. I thought that shutting down would abort that frame, but I suspect that it may still complete it, taking another 12 seconds.

There are a couple of things that could be checked to improve this, but it's not a priority at the moment.
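The accounting above reduces to simple arithmetic. This sketch assumes a 6 s shutter (-ss 6000000) and takes the two-frames-per-phase counts straight from the explanation; the variable names are illustrative:

```shell
SHUTTER=6                              # seconds per frame at -ss 6000000
STARTUP=$((2 * SHUTTER))               # corrupt frame + preview frame
CAPTURE=$((2 * SHUTTER))               # dropped frame on restart + the capture
POST=$((2 * SHUTTER))                  # post-capture switch back to preview
echo "$((STARTUP + CAPTURE))"          # prints 24: the expected minimum
echo "$((STARTUP + CAPTURE + POST))"   # prints 36: if the post-capture frame completes
```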

bbozon commented on September 10, 2024

It works!!! Thank you very much. I forgot to put
-bm
-ex off
-tl 0

Now it works!
Sorry for the trouble...
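For reference, a sketch of how the flags above might combine into a single invocation. The output path and the 6 s shutter value are illustrative; the `command -v` guard is added so the sketch is a no-op without the Pi camera stack:

```shell
SHUTTER_US=$((6 * 1000000))   # -ss takes microseconds
echo "$SHUTTER_US"            # prints 6000000

# -bm     burst mode, avoiding the post-capture switch back to preview
# -ex off disables auto-exposure
# -tl 0   the timelapse setting used in this workaround
if command -v raspistill >/dev/null 2>&1; then
  raspistill -o /tmp/long.jpg -bm -ex off -tl 0 -ss "$SHUTTER_US"
fi
```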
