rawpy questions (white balance; dcraw -A option) #12
Comments
Sure, no problem. The -A option seems to be supported by libraw.
Any news about this issue?
I've developed a workaround that seems OK. It works like this:
The important points here are that the user_wb multipliers are all set to 1.0, so no colour balancing will take place. The brightness is also set to 1, with no auto-brightness. I chose the LINEAR demosaic algorithm as it claims to be the quickest.
This seems to work very well. After the second extraction, the selected grey area of the image appears, and measures, to be neutral grey. I hope this is clear - get back to me if it isn't! Steve Morton
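Steve's actual code did not survive in the thread text above, so the following is only a sketch of the two-pass process as he describes it. The rawpy calls are shown as comments (user_wb, no_auto_bright, bright and demosaic_algorithm are real `postprocess()` keywords, but the file name is hypothetical), and only the multiplier arithmetic is live code:

```python
# Pass 1 -- extract linear image data with no colour balancing
# (hypothetical file name; keyword names are rawpy's postprocess() parameters):
#
#   import rawpy
#   with rawpy.imread("reference.dng") as raw:
#       rgb = raw.postprocess(
#           user_wb=[1.0, 1.0, 1.0, 1.0],  # unity multipliers: no WB applied
#           no_auto_bright=True, bright=1.0,
#           demosaic_algorithm=rawpy.DemosaicAlgorithm.LINEAR)
#
# Average each channel over the user-selected grey area of `rgb`, then
# derive multipliers that make that area neutral (green held at 1.0):

def wb_multipliers(avg_r, avg_g, avg_b):
    """user_wb values [r, g, b, g] that neutralise the grey patch."""
    return [avg_g / avg_r, 1.0, avg_g / avg_b, 1.0]

# Pass 2 -- repeat postprocess() with user_wb=wb_multipliers(...);
# the selected area should then measure as neutral grey.
print(wb_multipliers(1000.0, 2000.0, 1600.0))  # [2.0, 1.0, 1.25, 1.0]
```

Normalising to the green channel is one common convention; the absolute scale of the multipliers only shifts overall brightness.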
Great work, I think it's actually better if more of that custom functionality happens outside rawpy itself, since it will only ever have a limited set of base functionality.
I used another strategy:
1st step:
2nd step:
3rd step:
Voilà. Those pre_mul values can be used with the -r keyword 👍 Hope that helps.
Thanks for this – particularly the half-size option, which I had forgotten about! It probably isn’t clear from my reply, but this whole colour balance process happens more or less automatically. The only user input required is, first, to select the raw file to use for the white balance and, second, to select the neutral area in that file.

The program of which this is a part is intended for processing raw files containing images of colour negative film – i.e. I take a photo of the negative on a light box. I use the rawpy user white balance to compensate for the orange base colour of the film, so the colour balance image usually contains a section of clear film base. Once the multipliers have been determined, they can then be applied to the images from all the negatives from that particular film, so I write them into a small file in the same folder as the .dng’s for future use. As an aside, it is interesting how much difference there is between colour negative films, even those by the same manufacturer and of the same type – no wonder darkroom colour printing was so difficult! Once the multipliers for a particular film have been determined, the raw files from the whole film will be processed with the same values.

I use a similar technique for determining the Brightness value – I take the average of the green channel of the reference patch (which in my case is the film base) and calculate the Brightness parameter as (2^16 − 1)/avgG (for 16-bit images), i.e. the clear film base represents the brightest part of the (negative) image. Later in the program, after inverting the image, I set both the black and white points using actual image data.

Finally, I adjust the final brightness of the image, and tweak the final colour balance, by applying user-selectable gamma functions to each channel (i.e. px_out = (px_in^gamma) * scaling_factor). Gamma values vary between 0.5 and 3.0, and numpy makes it almost trivial to use look-up tables to apply the function to the image data.
The program actually works extremely well. I import the final image (saved as a tiff file) into Adobe Lightroom, where any final twiddles can be performed, but often none are needed. In parallel, I have created a similar program for handling monochrome negatives (of which I have several thousand, dating back to the 1960’s!), which also works well. I might get round to writing this all up properly on my currently moribund website at www.stevemortonphotography.co.uk, but I probably ought to redo the site first! Cheers Steve Morton
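The brightness and per-channel gamma steps described above lend themselves to a short numpy sketch. The function names are mine, not Steve's, and the 16-bit normalisation and clipping details are assumptions:

```python
import numpy as np

def brightness_from_base(avg_g):
    """Brightness parameter from the green average of the clear film base,
    as described above: (2**16 - 1) / avgG for 16-bit images."""
    return (2 ** 16 - 1) / avg_g

def apply_channel_gammas(img16, gammas, scale=1.0):
    """px_out = (px_in ** gamma) * scaling_factor, per channel, via LUTs.
    img16 is a uint16 HxWx3 array; gammas is (g_red, g_green, g_blue)."""
    x = np.arange(65536, dtype=np.float64) / 65535.0   # 0..1 input ramp
    out = np.empty_like(img16)
    for c, g in enumerate(gammas):
        lut = np.clip((x ** g) * scale * 65535.0, 0.0, 65535.0)
        out[..., c] = lut.astype(np.uint16)[img16[..., c]]
    return out

# Pure white stays white for any gamma (1 ** g == 1):
img = np.full((2, 2, 3), 65535, dtype=np.uint16)
out = apply_channel_gammas(img, (0.5, 1.0, 2.0))
print(out[0, 0])  # [65535 65535 65535]
```

Building a 65536-entry table once and indexing the image through it is far cheaper than computing the power function per pixel, which is presumably the "almost trivial" numpy approach Steve alludes to.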
I was wondering if one of you has a good test image I could use for a public ipython notebook where I would demonstrate the gray area selection and white balance calculations etc. I'm trying to get some examples together. If so, let me know (by email if you prefer) or just upload it somewhere. Thanks in advance!
Hi Mark
I would be happy to let you have a couple of images to show how the work-around that I described earlier works, but do note that my application is pretty specialised, and only likely to be meaningful to someone trying to do the same thing!
It may take a short while to sort out some examples, and explain what my code is doing, and then I’ll put it on my Google Drive site and send you a link, if this is OK?
Best wishes
Steve Morton
From: Maik Riechert
Sent: Monday, May 9, 2016 11:46 PM
To: neothemachine/rawpy
Cc: Steve8650 ; State change
Subject: Re: [neothemachine/rawpy] rawpy questions (white balance; dcraw -A option) (#12)
That's perfect, thanks. The more specialized, the better.
OK, here is a link: https://drive.google.com/folderview?id=0B_1stN2zxz2tSmtsOEN2RWJlXzg&usp=sharing
I’ve tried to explain, succinctly, what it is all about in the file demo.docx, and I think I have included everything you need to actually run the program. Good luck! If you do produce a public notebook, could you send me a link? Also, any suggestions for improving the code would be welcome. Best wishes Steve
@Steve8650 I just finished the first version of the notebook. It doesn't go into as much detail as your document and code, but it illustrates the basics that are relevant to the use of rawpy. Let me know if you spot an obvious error or think it can be improved in any way (I'm sure it can!). I simplified the process in some ways (e.g. not using 16 bit images, and using the absolute black level instead of a percentile) but the main principles should be the same.
Hmm, I was just wondering. Why is the separate blank film base actually necessary? I mean, there is also blank film on each colour negative, at the bottom where the numbers and holes are. It's smaller, but couldn't you just use that?
Maik

That is really good! I hadn’t seen a Jupyter notebook ‘in action’ before – impressive! I hadn’t considered using matplotlib to display the image – it might have been a bit easier than all that messing around with tkinter, although I would still have needed a user interface, I suppose.

A couple of points: right at the start, you should mention that the images used are photographs (not scans) of the negatives, saved as raw files. I don’t see why a similar process wouldn’t work with a film scanner, but most of them come with commercial software to do the conversions. Using a normal DSLR to capture the original negative is quick and easy.

At point 7, the reason that the green channel is used as the reference is that the DSLR sensor has twice as many green pixels as it has red and blue ones (the Bayer matrix), so, in theory at least, it is the least noisy of the three channels.

Cheers Steve
Maik
Yes, you are correct, but there are a couple of practical difficulties in using either the film edge or the inter-frame gap.
The film edge not only has the sprocket holes in it, but it is also used by the different manufacturers for recording a bunch of info, such as film type, version, frame number etc. If the original 35mm cassette and/or camera suffered from light leaks (and some of the negatives I’m working with do seem to have suffered this problem), it is usually the edges of the film that suffer most. If any of these are included in the reference area used to calculate the white balance and brightness parameters, then errors will be introduced for the whole film. Also, it is only recently that I started including the film edge in the negative capture, so that I could see the frame number – I have a lot of captures without film edges.
The inter-frame gap can be quite narrow, and sometimes the film is tilted slightly when the capture is made, so that it is sometimes difficult to select a suitable area. I use a negative carrier from an old scanner to hold the film for copying, which is masked to 24 x 36mm, so it isn’t possible to include the inter-frame gap with a normal exposure. If I have to make a special exposure, I will use clear film base if there is some available, otherwise I will choose the largest inter-frame gap on that film.
Finally, there is the theoretical argument that says that the larger the area you use to estimate the parameters, the more representative of the whole film the parameters will be. I doubt that this is important, though!
Hope this helps.
Cheers
Steve
Maik
One other thing – you have linked to my web site (thanks!), but as you may have noticed, I haven’t been keeping the site up to date. Specifically, I don’t think there is any way someone can contact me directly via the site. Could you put my e-mail address (steve.morton8650@gmail.com) somewhere on the tutorial in case anyone does want to contact me?
Many thanks
Steve
Maik
I hope you don't mind me contacting you again, but I have some questions about the use of rawpy. I'm used to using dcraw directly, and often use the command-line option -A, which calculates the white balance from a selected area of the image and then prints the white balance multipliers on STDERR, ready for use with the option -r on similar images. rawpy seems to support the -r option, via the user_wb parameter, but I can't find an equivalent of the -A option, nor a means of obtaining the multipliers for future use.
Have I missed something, or are these facilities not present (yet?) in rawpy?
Best wishes
Steve Morton
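To summarise the mapping the question is asking about: dcraw's -r corresponds to rawpy's user_wb parameter, while -A has no direct rawpy equivalent (which is what motivated the workarounds above). A hedged sketch follows; the rawpy attribute names in the comments are real rawpy properties, the file name is hypothetical, and the formatting helper is mine:

```python
# dcraw -r "r g b g"  <->  rawpy postprocess(user_wb=[r, g, b, g]).
# rawpy also exposes the camera's own multipliers as metadata:
#
#   import rawpy
#   with rawpy.imread("frame01.dng") as raw:
#       print(raw.camera_whitebalance)     # as-shot multipliers
#       print(raw.daylight_whitebalance)   # camera daylight pre_mul values
#       rgb = raw.postprocess(user_wb=[2.1, 1.0, 1.4, 1.0])  # like dcraw -r
#
# A small helper to carry multipliers back to a dcraw command line:

def as_dcraw_r(mults):
    """Format four multipliers as a dcraw -r argument string."""
    return "-r " + " ".join(f"{m:g}" for m in mults)

print(as_dcraw_r([2.1, 1.0, 1.4, 1.0]))  # -r 2.1 1 1.4 1
```

Keeping the multipliers in a plain text form like this matches Steve's practice of writing them to a small file alongside the .dng's for reuse across a whole film.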