Change Camera Settings Dynamically - Multiplatform

I'm trying to find a good way to alter camera settings on the fly (focus, white balance, resolution, etc.) that will work on multiple platforms (RPi, Windows, Mac). Currently I am able to change settings, but only when using mjpegstreamer with the control.htm page enabled (which requires changes to octopi.txt).

What I would like to do is to detect the streaming server type (yawcam, mjpegstreamer), and to adjust the settings without access to any special pages (control.htm), maybe through a combination of bash/bat scripts, or via some 3rd party library. I also would like this to support multiple cameras if possible, so it would be great if I could provide a list of attached cameras.
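For the Linux/RPi side, the best idea I have so far is shelling out to v4l2-ctl, roughly like the sketch below. This is only a rough idea, not a finished solution: Windows and Mac would need different back ends, and the device path and control name here are just examples since available controls vary per driver.

```python
import subprocess


def list_cameras():
    """List V4L2 capture devices (Linux only; assumes v4l2-ctl is installed)."""
    result = subprocess.run(
        ["v4l2-ctl", "--list-devices"],
        capture_output=True, text=True, check=True
    )
    return result.stdout


def set_control(device, control, value):
    """Set a single camera control, e.g. focus or white balance."""
    subprocess.run(
        ["v4l2-ctl", "-d", device, "--set-ctrl", f"{control}={value}"],
        check=True
    )


if __name__ == "__main__":
    print(list_cameras())
    # Example only: control names differ between cameras/drivers
    set_control("/dev/video0", "white_balance_automatic", 0)
```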

Any ideas?

Thanks!

I never checked, but do those webcam servers maybe send a usable Server http header that could be used for detection here? A simple HEAD request should already help in such a case.


I just checked, and mjpegstreamer at least does in fact return the following Server header value for both snapshot and control.htm access: MJPG-Streamer/0.2
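So detection could boil down to a HEAD request plus a check of that header. A minimal sketch, assuming Python with requests; the yawcam branch is only a guess, since I haven't checked what that server actually reports:

```python
import requests


def detect_stream_server(url):
    """Guess the webcam server type from the Server header of the HTTP response."""
    response = requests.head(url, timeout=5)
    if not response.headers.get("Server"):
        # Some servers don't answer HEAD nicely; fall back to a GET without reading the body.
        response = requests.get(url, timeout=5, stream=True)
        response.close()
    server = response.headers.get("Server", "")
    if server.startswith("MJPG-Streamer"):
        return "mjpg-streamer"
    if "yawcam" in server.lower():
        return "yawcam"  # unverified guess at what yawcam reports
    return "unknown"


print(detect_stream_server("http://IP_ADDRESS/webcam/?action=snapshot"))
```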

Awesome! Why didn't I think of that :slight_smile:?

Any ideas on how to work around the problem of control.htm access? Is there a way to make it accessible locally by default instead of blocking control.htm? Most users who have asked have been able to edit octopi.txt successfully, but I'm guessing many other users try it and ask no questions when it fails. I could pretty easily detect this issue when requests sent to control.htm return 404 or something similar, and suggest a fix or link to a help file. Additionally, even when control.htm is available, there is still the problem of unauthenticated access, though this could be mitigated if only local access is available.
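The detection itself could be as simple as the sketch below (the URL assumes the default OctoPi webcam proxy, and the messages are just placeholders for whatever the plugin would actually show):

```python
import requests

# Default OctoPi path for the mjpg-streamer control page; adjust if the proxy differs.
CONTROL_URL = "http://IP_ADDRESS/webcam/control.htm"

try:
    response = requests.get(CONTROL_URL, timeout=5)
except requests.RequestException as err:
    print(f"Could not reach the webcam server at all: {err}")
else:
    if response.status_code == 404:
        print("control.htm is disabled; point the user at the octopi.txt change / help page.")
    elif response.ok:
        print("control.htm is reachable, camera controls should work.")
    else:
        print(f"Unexpected status {response.status_code} when probing control.htm.")
```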

I've not yet even tried yawcam, so I'll try to configure this and see what I can find.

Thanks for your helpful suggestions!

I was searching for something like that: changing camera settings dynamically.

Many questions to come, as I've also been working with Octolapse recently.

It seems control.htm is key here, and many parameters work fine through it, but unfortunately none with the raspi camera.

About the raspi: I got it to work at a weird, non-standard resolution, and my question is about resolution and FPS.

Can these be set somewhere "on the fly", without needing a reboot?

Another point: is there a way to access separate-resolution streams from the camera? The raspicam should be able to provide 3 types of stream (stills, preview and encoding), as read in

So in that case, could we use "stills" mode to make the timelapse at a different resolution?

Cool ideas. So, as things stand, this is something that a streaming server could implement. Also, fyi, mjpgstreamer has stubs for changing the resolution. I think someone would just need to implement it. The same is true for the stills mode. If all that could be accomplished via an HTTP GET/POST request, we'd be golden (http://IP_ADDRESS/webcam/?action=snapshot&resolution=DESIRED_RESOLUTION_HERE)
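Just to make that concrete, the client side might look something like the sketch below. To be clear, the resolution parameter is purely hypothetical and does not exist in mjpgstreamer today; this is only what the request could look like once someone implements those stubs.

```python
import requests

# Hypothetical only: mjpg-streamer does NOT currently accept a resolution parameter.
params = {"action": "snapshot", "resolution": "1920x1080"}
response = requests.get("http://IP_ADDRESS/webcam/", params=params, timeout=5)

with open("snapshot_fullres.jpg", "wb") as f:
    f.write(response.content)
```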

Not sure about my understanding of this:

as it's about the mjpg compilation option, WebcamXP to mjpg-streamer.

I'm looking for info about the

in the code, trying to figure out what this is and how it works
(still hoping for an easy trick via a parameter).


So, you can get a still frame from the mjpgstream today like this:

http://IP_ADDRESS/webcam/?action=snapshot

Here is that syntax in action:

[screenshot]
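And if you want to grab that frame programmatically rather than in a browser, here's a minimal sketch using Python's requests (replace IP_ADDRESS with your Pi's address):

```python
import requests

SNAPSHOT_URL = "http://IP_ADDRESS/webcam/?action=snapshot"

response = requests.get(SNAPSHOT_URL, timeout=5)
response.raise_for_status()

# The snapshot comes back as a plain JPEG at whatever resolution the stream is currently running.
with open("snapshot.jpg", "wb") as f:
    f.write(response.content)

print(f"Saved {len(response.content)} bytes ({response.headers.get('Content-Type')})")
```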

BUT, getting a still frame from a resolution that is different from the camera stream is NOT possible. It would require someone to program this into mjpgstreamer. It would be awesome, but would not be 'easy' at all.

I've come to the same unfortunate conclusion. But I've seen something in the mjpg-streamer code that makes me feel optimistic about it. mjpg-streamer doesn't seem to encode video the way ffmpeg does. My understanding is that it takes JPEG "frames" into a buffer, then writes them to the output pipeline using MMAL (in the raspicam input); it looks a bit different in the UVC/V4L2 driver.

So I guess one should be able to insert another "frame" through the preview port at the full raspicam resolution.
It seems the ports are limited in sharing their access.
I would understand that while encoding video, but why aren't these ports shared when just snapshotting a JPEG flow and serving it, which uses an MMAL port 100%?

These are just guesses for now and I need to keep reading.
I'm hanging on despite it being way beyond my skills.

We learn every day.
I'm motivated, as I'm convinced I'm on a good lead.

The funny thing is that "select-resolution" is a commented-out section of the control.htm code,
as read in


That's what I meant earlier in this thread where I said:

Also, fyi, mjpgstreamer has stubs for changing the resolution.

It was thought about, started, but not completed. All it would take is for a resourceful dev to fork the repo, make the change, and submit a pull request.