Hi Manuel
In AeEnabled mode, I don’t think you can always obtain an AnalogueGain of 1. The split between ExposureTime and AnalogueGain is set by the exposure mode in the tuning file, so you have to let the algorithm do its job, and add lighting if AnalogueGain always comes out too high.
On the other hand, unlike picamera, you can obtain the request metadata without any Jpeg encoding, which is sufficient, faster, and useful for detecting when the AGC/AEC algorithm has converged, and also for HDR when you want to reach a set of fixed exposures:
metadata = picam2.capture_metadata()
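As a sketch of that convergence test, here is one way to poll the metadata until ExposureTime and AnalogueGain stop moving. The function takes any callable returning a metadata dict, so passing picam2.capture_metadata is the intended use; the tolerance and frame-count values are assumptions to tune, not fixed by the library.

```python
import time

def wait_for_ae_convergence(get_metadata, tolerance=0.02,
                            stable_frames=3, timeout=5.0):
    """Poll metadata until ExposureTime and AnalogueGain settle.

    get_metadata: a callable returning a metadata dict, e.g.
    picam2.capture_metadata. Returns the last metadata dict once the
    relative change stays below `tolerance` for `stable_frames`
    consecutive frames, or None on timeout.
    """
    last = None
    stable = 0
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        md = get_metadata()
        if last is not None:
            # Relative change of exposure and gain between frames.
            de = abs(md["ExposureTime"] - last["ExposureTime"]) / last["ExposureTime"]
            dg = abs(md["AnalogueGain"] - last["AnalogueGain"]) / last["AnalogueGain"]
            stable = stable + 1 if max(de, dg) < tolerance else 0
            if stable >= stable_frames:
                return md
        last = md
    return None
```

Usage would be `md = wait_for_ae_convergence(picam2.capture_metadata)`; checking the AeLocked metadata field is an alternative when it is available.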
You can also obtain image and metadata without Jpeg or DNG encoding, as it is desirable to do this in one or more threads or processes.
request = picam2.capture_request()
image = request.make_image("main") # image from the "main" stream
…queue to a thread
metadata = request.get_metadata()
request.release() # requests must always be returned to libcamera
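The "queue to a thread" step above can be sketched like this: copy the image and metadata out of the request, release it immediately so the buffer goes back to libcamera, and hand the copies to a worker over a bounded queue. The process_item callable and the queue size are placeholders, not Picamera2 API.

```python
import queue
import threading

work_q = queue.Queue(maxsize=4)  # small bound so capture can't run far ahead

def worker(process_item):
    # Consume (image, metadata) pairs until a None sentinel arrives.
    while True:
        item = work_q.get()
        if item is None:
            break
        process_item(*item)
        work_q.task_done()

def capture_loop(picam2, process_item, n_frames):
    t = threading.Thread(target=worker, args=(process_item,), daemon=True)
    t.start()
    for _ in range(n_frames):
        request = picam2.capture_request()
        image = request.make_image("main")   # copies data out of the buffer
        metadata = request.get_metadata()
        request.release()                    # return the buffer at once
        work_q.put((image, metadata))        # queue the copies, not the request
    work_q.put(None)
    t.join()
```

The important point is that the request is released before the slow work starts, so holding only two or three buffers is enough.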
The question of multithreading versus multiprocessing needs to be studied carefully. In your program, you use Herman’s idea of a pool of threads to send images to the socket. In my opinion, this serves no purpose at all: there is only one socket, protected by a Lock, so a single thread would do just as well!
So the simplest, and probably sufficient, setup is one thread for Jpeg or DNG encoding and sending to the socket. If that is not enough, use a pool of threads for encoding, plus one thread that submits to the pool, retrieves the results in order and sends them. And if that is still not enough, do it with processes rather than threads.
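The second stage (pool for encoding, one thread for ordered sending) can be sketched with a ThreadPoolExecutor. Because a single sender thread consumes the futures in submission order, frames leave in order and the socket needs no Lock; encode and send are placeholders for your Jpeg/DNG encoder and socket write.

```python
from concurrent.futures import ThreadPoolExecutor
import queue
import threading

def run_pipeline(frames, encode, send, workers=4):
    """Encode frames in a thread pool but send them in order."""
    futures = queue.Queue()

    def sender():
        # Sole consumer of the socket: ordering comes for free.
        while True:
            fut = futures.get()
            if fut is None:
                break
            send(fut.result())  # blocks until that frame is encoded

    t = threading.Thread(target=sender)
    t.start()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for frame in frames:
            futures.put(pool.submit(encode, frame))  # submit order == send order
    futures.put(None)
    t.join()
```

Swapping ThreadPoolExecutor for ProcessPoolExecutor gives the third stage of the escalation, provided the frames and encoder are picklable.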
Finally, note that for HDR, David Plowman’s trick is really interesting: for example, in full res, you can get 5 exposures in just 15 requests…
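One part of that bracketing scheme can be sketched without the camera: after queueing a cycle of fixed ExposureTime values via set_controls, you match each returned request’s metadata against the targets, since the pipeline is several frames deep and the frames arrive with a delay. This helper shows only the matching side; the tolerance value is an assumption, and this is my reading of the idea, not David Plowman’s exact code.

```python
def match_brackets(metadatas, targets, rel_tol=0.05):
    """Pick, for each target exposure, the first matching frame.

    metadatas: per-request metadata dicts in arrival order (e.g. from
    request.get_metadata()); targets: the fixed ExposureTime values that
    were queued. Returns {target: frame_index} for the targets seen.
    """
    found = {}
    for i, md in enumerate(metadatas):
        for t in targets:
            # Sensors quantise exposure, so compare with a relative tolerance.
            if t not in found and abs(md["ExposureTime"] - t) <= rel_tol * t:
                found[t] = i
                break
    return found
```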
One last point: if at all possible, you should have at least two request buffers.
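For reference, the buffer count is set when building the configuration; a minimal sketch, where the choice of 3 buffers and of a video configuration are assumptions for your use case:

```python
# Ask Picamera2 for extra request buffers so the camera can fill the
# next frame while you still hold the previous request.
config = picam2.create_video_configuration(buffer_count=3)
picam2.configure(config)
picam2.start()
```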
Regards
Dominique