Different camera models are an interesting field of research. For example, NASA uses the CAHVORE model for its scientific cameras to correct for all sorts of distortions, and I think OpenCV also has its own models. And now we have this novel model of lens blur that can be added to the mix.
This is all very relevant if you want to do e.g. 3D reconstruction or view synthesis from a set of images; I imagine the better your knowledge of the camera, the better you can do.
mcdeltat 3 hours ago [-]
I always wondered why lens blur is considered computationally hard. Lens mathematics seems pretty well understood, given that we can create quite complex lens designs with incredible performance (take a look at modern DSLR lenses; they often have 10+ elements). And blurs in general (e.g. Gaussian) are not complex algorithms.
Are there situations where lens blur is easier or harder? I've heard it's hard to add blur to 2D images, which seems true, because most smartphone artificial bokeh is horrible despite significant effort there. Presumably because depth information is missing? Is it easier for raytraced 3D renders?
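A plain Gaussian blur really is simple; a minimal sketch (my own illustration, not from the article) shows it is just a separable convolution with a uniform kernel, which is also exactly why it can't mimic defocus, where the blur width varies with scene depth:

```python
import numpy as np

def gaussian_kernel(sigma, radius=None):
    """1-D Gaussian kernel; separable, so a 2-D blur is two 1-D passes."""
    if radius is None:
        radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def gaussian_blur(img, sigma):
    """Uniform Gaussian blur of a 2-D grayscale image.
    The same sigma is applied everywhere, independent of depth."""
    k = gaussian_kernel(sigma)
    # Convolve columns, then rows (separability of the Gaussian).
    out = np.apply_along_axis(np.convolve, 0, img, k, mode='same')
    out = np.apply_along_axis(np.convolve, 1, out, k, mode='same')
    return out
```

Per-pixel work is constant regardless of image content; a physically plausible defocus instead needs a per-pixel kernel size driven by a depth map.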
meindnoch 2 hours ago [-]
Lens blur without a depth map is an ill-posed problem. So the computation goes into faking depth information somehow.
zipy124 2 hours ago [-]
It is exactly that. Blur is a function of depth relative to the focal distance, so without depth information you cannot "blur" correctly.
AlecSchueler 2 hours ago [-]
You've got it exactly. It's difficult to recover depth when you only have two dimensions. In a 3D environment it's as simple as you would expect.
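To make the depth dependence concrete, here is a small sketch (my own illustration; the formula is the standard thin-lens circle-of-confusion, and the 50 mm f/2 numbers are made up for the example). The blur disk diameter is zero at the focus distance and grows as a point moves nearer or farther, which is why a blur pass needs per-pixel depth:

```python
def coc_diameter(depth, focus_dist, focal_len, aperture):
    """Thin-lens circle-of-confusion diameter (same units as focal_len)
    for a point at `depth`, with the lens focused at `focus_dist`.
    Zero exactly at the focus plane; grows toward near and far."""
    return (aperture * abs(depth - focus_dist) / depth
            * focal_len / (focus_dist - focal_len))

# Example: a 50 mm f/2 lens focused at 2 m (illustrative numbers).
f = 0.050        # focal length, m
A = f / 2.0      # aperture diameter, m
for z in (0.5, 2.0, 10.0):
    c = coc_diameter(z, 2.0, f, A)
    print(f"depth {z:4.1f} m -> CoC {c * 1000:.2f} mm")
```

Note the asymmetry: for the same distance from the focus plane, near points blur faster than far ones, another detail a uniform 2-D blur gets wrong.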
hengheng 3 hours ago [-]
I can't imagine there being enough information for true fingerprinting of individual devices. With ten million iPhones being made per month, surely the blur patterns have to have some overlap?
Daub 4 hours ago [-]
I have always found it odd how in VFX we spend a lot of time degrading our perfect 3D renders: motion blur, film grain, sensor noise, and lens blur (which I would call defocus). I am interested in the applications of this research, and imagine a library of typical cameras and their associated blurs. We already have similar libraries of film grain and sensor noise.
adastra22 5 hours ago [-]
This would seem to have huge forensic applications.
Does this lens blur change over time for a given phone?
spaqin 3 hours ago [-]
It doesn't, unless the camera is damaged. How the blur looks is a consequence of the lens' optical design.
zokier 2 hours ago [-]
I would assume the fingerprint would be extremely sensitive to the alignment and positioning of the optical elements, and I don't find it far-fetched that those could shift minutely during the lifetime of a device without ever reaching a point where the camera is actually damaged.