>Please, in your own words, take some time to try and understand what the article is about.

I'm pretty sure I understand where you're going with it. I just think the fundamental underlying assumptions are flawed. On top of that, you don't clearly distinguish between what is an issue with the format and what is an implementation issue.

>In doing so you will find the answers to many of your questions right in the article, including specific suggestions and the archival value of allowing more proprietary tags like colour decoding profiles of un-specified structure.

Well, the DNG spec does currently provide for anyone to embed profiles that describe the color any way they wish. If a manufacturer wanted to create and embed its own tags, or to make profiles available separately, it could do that, as I understand it.

I think the far more important and challenging issue is how to deal with tags for further decoding and output of the files. As imaging technology moves from simply trying to render the files "correctly" to making more creative interpretations of the captured data, the challenge becomes attaching the custom profiles that can control that output, and keeping them available as the file gets passed from user to user. In this regard, the new 2.5-D profiles that DNG allows are a really ingenious solution. I suggest checking them out.
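Roughly, the idea is a lookup table indexed by hue and saturation (with an optional, coarse value axis, which is what makes it "2.5-D") whose entries shift hue and scale saturation and value. Here's a loose sketch in Python of the mechanism, not the spec's actual tag layout or interpolation, with made-up numbers:

```python
# Rough sketch of the idea behind the 2.5-D profile tables in DNG 1.2
# camera profiles (the ProfileHueSatMapDims / ProfileHueSatMapData tags):
# a grid indexed by hue and saturation whose entries say how to shift hue
# and scale saturation and value. The real spec interpolates between
# entries; this nearest-entry version only illustrates the mechanism.

def apply_hue_sat_map(h, s, v, table, hue_divs, sat_divs):
    """h in [0, 360); s, v in [0, 1]; table[hi][si] = (hue_shift_deg, sat_scale, val_scale)."""
    hi = min(int(h / 360.0 * hue_divs), hue_divs - 1)
    si = min(int(round(s * (sat_divs - 1))), sat_divs - 1)
    hue_shift, sat_scale, val_scale = table[hi][si]
    return ((h + hue_shift) % 360.0,
            min(s * sat_scale, 1.0),
            min(v * val_scale, 1.0))

# Made-up 2 x 2 table: e.g. nudge hues and tame saturation in one corner.
table = [[(0.0, 1.0, 1.0), (5.0, 0.9, 1.0)],
         [(-3.0, 1.1, 0.95), (0.0, 1.0, 1.0)]]
print(apply_hue_sat_map(200.0, 0.8, 0.5, table, hue_divs=2, sat_divs=2))
```

Because a profile like that can be embedded in the DNG itself, the look it encodes travels with the file from user to user.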

Charting the spectral response of the camera's chip is far less important than enabling the user to create a portable parametric rendering.

Once again, all parametric renderings will be engine-specific, and are likely to change somewhat over time as applications are improved. (Those are implementation decisions made by the application's creator, not an issue of format.) In fact, the rendering does not even *exist* until the raw data is loaded into the imaging pipeline along with the instructions. That's why the ability to attach a fixed rendering is so important. It is the *only* way to guarantee a totally predictable rendering, unless we stop software from developing any further.

Likewise, *backwards* compatibility of parametric rendering engines simply cannot work the way you outline unless:
1. The engines are not allowed to change and improve, or
2. Software manufacturers are obligated to update all old versions with new capabilities whenever they are developed.
Neither one of these will happen.

>By the way, in your opinion, shall we consider that the history of DNG starts with v. 1.2 and ACR v. 4.5?

I'd say 1.1, rather than 1.2. That was when all image data, as well as all private makernotes, was first transferred. Prior to that, the format definitely left some stuff behind.

>In that case, it is not a bad start. But what to do with the users invested into DNG at v.1.0 stage? Were they misled?

Sure, if that's the way you want to look at it. The first implementation had problems. It would be great if all 1.0 implementations were problem-free.

>We would appreciate if instead of simply ruling out the results of our experiment as purposely skewed you would do those yourself and post the results proving your point.

Did you understand my criticisms? The premise of what you are asking for is impossible and/or undesirable.

>As one of the methods of verification archival properties, we suggest you make a DNG out of a RAW file, and then delete from a copy of that DNG all vendor-specific EXIF fields to evaluate the difference in processing results between the "pure" DNG stripped of those vendor-specific EXIF fields and more complete data.

I have no idea what you're driving at here. Are you saying that some DNG-aware applications can make use of those vendor-specific EXIF fields?

When you talk about "archival," what are you really talking about? Are you talking about access to the metadata, access to a particular rendering, or the survivability of the format itself?

The first of these is really the province of the camera maker, who encodes the data and could make it more discoverable. The second can *only* be addressed by an embedded rendering. And the third is an area that you glossed over entirely in your discussion of the spec. The data validation tags in DNG 1.2 are a huge leap forward in archival stability, as they offer a way to verify file integrity automatically. I suggest you check out this part of the spec.
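The pattern is simple: a digest of the raw image data is written into the file when it's created, and any tool can recompute it later and compare. Here's a minimal sketch of that verify-against-a-stored-digest idea; the exact bytes the spec digests, and the tags that hold the result, are defined in the spec, not here:

```python
import hashlib

def compute_digest(raw_image_bytes: bytes) -> bytes:
    # DNG 1.2's RawImageDigest tag holds an MD5 digest of the raw image
    # data; exactly how those bytes are ordered is the spec's business.
    return hashlib.md5(raw_image_bytes).digest()

def verify(raw_image_bytes: bytes, stored_digest: bytes) -> bool:
    return compute_digest(raw_image_bytes) == stored_digest

# Hypothetical usage: 'raw' would come from the DNG's image strips/tiles,
# 'stored' from the digest tag written when the archive copy was made.
raw = b"\x00\x01\x02\x03"
stored = compute_digest(raw)
print(verify(raw, stored))            # True  -> data intact
print(verify(raw + b"\xff", stored))  # False -> data has changed
```

Since the digest travels inside the file, an archive can be re-checked at any time without needing the original camera or converter.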

>We would also love to see your priorities being sorted out. You begin your second paragraph with "most important", and continue with "most importantly" in the paragraph next to your last one. Such phrasing indicates that you were rather sloppy, again, to use a word of yours, while writing your comment.

Okay, fine, it was sloppy. I should have said that all of those things are important. It's important to read and understand something before you send out a purportedly comprehensive discussion of it. It's also important to understand what is a problem with the spec and what is a problem with a specific implementation. And it's important to understand what's possible and what's not.

>It is our feeling that the readers of your comment would benefit from a disclosure more detailed than just The DAM Book author; like mentioning your special relations with Adobe in the status of Adobe Alpha Tester.

Okay, yes, I am an unpaid alpha tester for Adobe, which gives me access to the development teams. And I helped Adobe improve the DNG spec, with particular emphasis on addressing the issues raised by other software vendors, such as Bibble (ask Eric - he brought specific problems to light, and they were addressed in 1.2). I also focused on the ability to facilitate predictable rendering, which can only be achieved by use of embedded rendered data. In addition, Adobe occasionally pays me (poorly) to speak. Adobe also paid me to write a white paper on this subject.

http://www.adobe.com/digitalimag/ps_pro_primers.html

I have also licensed photographs to Adobe in my capacity as a professional photographer.

Peter Krogh