resizing-72ppi vs 300ppi?

... any article that uses dpi and ppi as though they mean the same thing is anything but correct.
I'm going to repost something I posted a while ago on the same subject:

"PPI" is a fairly recent term. Originally, image resolution was always specified as "DPI."

"Resolution determines the level of detail recorded by the scanner and is measured in dots per inch (dpi)."
  • "User's Manual for Microtek Scanners," copyright 1995, p. 1-9
Resolution is also defined in terms of "dpi" in the official specs for JPEG and TIFF.

JPEG File Interchange Format
http://www.w3.org/Graphics/JPEG/jfif3.pdf

Go to page 5 (page 6 in the PDF) and look at the definitions of "Units," "Xdensity," and "Ydensity," and note "X and Y are dots per inch." (Or "dots per centimeter." Just be glad that we aren't arguing over that. Yet.)

TIFF
http://partners.adobe.com/public/developer/en/tiff/TIFF6.pdf

Look at page 38 and scroll down to the "ResolutionUnit" spec. Note that they specify resolution as "dots per inch."

So don't be surprised when people express the resolution of an image using "dpi," because that was (and still is, according to the JPEG and TIFF specs) the correct nomenclature for expressing image resolution in digital graphic arts for well over a decade. AFAIK, it was only fairly recently (in the lifetime of digital graphic arts) that the term "ppi" came into use.

And (IMO) the introduction of "ppi" hasn't reduced confusion. In fact, it has only increased confusion, as evidenced by the exchanges in this thread.

(End of repost)

You can't change history. I have been working with computer graphic arts since the early 1980s, and resolution has always been expressed in DPI. Read the JPEG and TIFF specs (above) if you don't believe me.

Wayne
 
The problem with the terminology has to do with the difference between an image created from scanning a photo or document and taking a photograph with a digital camera. You scan a printed image of a known size at a particular resolution, which can properly be measured in dots per inch. Photographs are captured without a reference to the size of the subject matter. The same camera at the same focal length can capture a 10 megapixel image of a subject anywhere from less than an inch across to many miles across. That is why the number embedded in most camera image files is completely bogus. It has no meaning or relevance. It is simply there to occupy a space in the data structure of the image file that was defined at a time when such a number had meaning. Maybe that is why some people try to make a distinction between PPI and DPI.
 
I have now read 4 books and searched the internet, and I am completely confused. The more I try to find out, the more confused I get.
This tends to be a very confusing subject. I doubt I’ll help :P

PPI stands for Pixels Per Inch, while DPI stands for Dots Per Inch. Pixels are usually used to refer to images and monitors. Dots are usually used to refer to printers and scanners.

Most image formats have a DPI setting. The original purpose of the DPI setting was to record the DPI used to scan a document. An image, by itself, has no size...it only has x number of pixels and y number of pixels. So in order to reproduce a scanned document in its original size, the scanning resolution (in DPI) was recorded with the image. So a document scanned at 200 DPI (such as a fax) can be printed at 200 PPI and you’d get the document in its original size.
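If it helps to see that relationship concretely, here is a minimal sketch in Python using the Pillow library. The file name is hypothetical, and it assumes the file actually carries a DPI/density tag; it simply divides the pixel counts by the stored DPI to recover the implied original size.

```python
# Minimal sketch: read the DPI tag a scanner stored in a file and work out
# the original document size it implies. "scan.tif" is a hypothetical file;
# many camera files carry only a meaningless default density.
from PIL import Image

img = Image.open("scan.tif")
width_px, height_px = img.size
dpi_x, dpi_y = (float(v) for v in img.info.get("dpi", (72, 72)))  # fall back to 72 if untagged

print(f"Pixels:            {width_px} x {height_px}")
print(f"Stored resolution: {dpi_x:.0f} x {dpi_y:.0f} DPI")
print(f"Implied size:      {width_px / dpi_x:.2f} x {height_px / dpi_y:.2f} inches")
```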

Then inkjets came along and threw lots of confusion into the meaning of DPI. For several years there were “DPI wars” with ever increasing DPI numbers. The only problem was that an inkjet printer’s dot was not the same as an image pixel. Some modern inkjets can use 64 printer dots to mimic one pixel from an image.

Today, even the cheapest printer can produce a pretty good print, so the emphasis on printer performance has moved to other areas, such as accurate color reproduction. Very few consumers worry about, or even know about, the PPI/DPI settings because most people print to fill the paper they’re using. They send the image to the printer and let the printer figure out how to print. Here’s what the printer does...

The printer takes the number of pixels (say 4000 x 3000) and divides by the dimensions of the paper (remember, you told the printer what size paper you’re using.) So for 8x10 paper it calculates 4000/10 = 400 and then 3000/8 = 375. It discards the larger number and prints the entire image at 375 PPI. This will result in some pixels being discarded. That’s because a 4000 x 3000 image is slightly longer than it is wide, when compared to an 8 x 10 image.
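If it helps to see that arithmetic spelled out, here is a small Python sketch using the numbers from the example above (the function name is just for illustration):

```python
# Fit-to-paper arithmetic as described above: divide each pixel dimension by
# the matching paper dimension, keep the smaller quotient as the printed PPI,
# and see how many pixels fall off the edge.
def fit_to_paper(px_long, px_short, paper_long_in, paper_short_in):
    ppi = min(px_long / paper_long_in, px_short / paper_short_in)
    cropped_long_edge = px_long - round(ppi * paper_long_in)
    cropped_short_edge = px_short - round(ppi * paper_short_in)
    return ppi, cropped_long_edge, cropped_short_edge

print(fit_to_paper(4000, 3000, 10, 8))
# (375.0, 250, 0) -> prints at 375 PPI, about 250 pixels cropped from the long edge
```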

That is what most people do. The printed images are simply scaled to fit the paper, and cropped to fit both dimensions.

Those of us that want the best print quality, however, take control of the PPI that our images are printed at. Going back to the previous example, before I would print that image I would resample the image to 6400 x 4800, creating a 30.7MP image. That’s because I want the image to print at 600 PPI. As to why I want to do that...that’s another discussion entirely. The point is that the resulting PPI of the image has nothing to do with the DPI setting in the file, and everything to do with the actual image dimensions vs. the actual print size of the paper. The way you can control the PPI that your image is printed at is by resampling. Most people will tell you it’s not necessary.
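As a rough illustration of that resampling step, here is a Python/Pillow sketch. The file names are hypothetical, and the 600 PPI target and 8x10 paper come from the example above, not from any printer's documentation.

```python
# Resample so the constraining edge (8 in here) lands at the desired print PPI,
# keeping the original aspect ratio: 4000 x 3000 becomes 6400 x 4800 for 600 PPI.
from PIL import Image

TARGET_PPI = 600
PAPER_SHORT_EDGE_IN = 8

img = Image.open("photo.jpg")                              # hypothetical 4000 x 3000 original
scale = (TARGET_PPI * PAPER_SHORT_EDGE_IN) / min(img.size)
new_size = (round(img.width * scale), round(img.height * scale))
resampled = img.resize(new_size, Image.LANCZOS)
resampled.save("photo_600ppi.tif", dpi=(TARGET_PPI, TARGET_PPI))
print(resampled.size)                                      # (6400, 4800)
```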

Probably the best thing you can do for your prints is to use Qimage to print your images.

.
 
I think the confusion comes down to two different ways of thinking about and measuring an image.

this is a high resolution image = it has lots of pixels. I call this image resolution

this is a high resolution setting/process = it involves lots of dots per inch. I call this input or output resolution depending on the cause of, and the purpose for, this setting.

The reason you might use a high resolution setting on a scanner, is because you want to use a process that will produce an image with many pixels, rather than fewer pixels. The reason you might use a high resolution printing setup, is because you want to represent the image as many small areas of colour, rather than fewer large areas of colour. The process setting used has an effect on the result.

If you spend time scanning 35mm negatives, say, then you may become used to thinking of a particular dpi scanner setting as being equivalent in practice to a given level of detail in the overall image produced. However, when given a different sized negative to scan, this experience would not transfer over directly - it is too specific for that; mental allowances would have to be made for the changed format.

When a digital camera has settings for high and low resolution, this is expressed entirely in terms of how many pixels. Do you want a 2 megapixel image, or a 10 megapixel image? No dpi options. So no mental allowances or calculations need be made as to what input ("scanning") dpi gives the desired result for each type of camera sensor.

In reality, a 2 MP and a 10MP image will both be saved with the same ppi setting that the camera always assigns; this tells you nothing about which one is more detailed. If you changed them both to 300 ppi, that would still tell you nothing about their relative quality and levels of detail. You would have to start exploring scenarios like, what would happen if I printed these two images using that setting, how big a printed picture would I get. In fact the software makes these scenarios quite easy.
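To put numbers on that, here is a tiny Python sketch. The pixel counts are purely illustrative stand-ins for a 2 MP and a 10 MP image, and 300 ppi is just the example setting.

```python
# Same ppi tag, very different implied print sizes: only the pixel
# dimensions tell you anything about the level of detail available.
images = {"2 MP": (1600, 1200), "10 MP": (3648, 2736)}
for name, (w, h) in images.items():
    print(f"{name}: {w} x {h} px -> {w / 300:.1f} x {h / 300:.1f} in at 300 ppi")
# 2 MP: 1600 x 1200 px -> 5.3 x 4.0 in at 300 ppi
# 10 MP: 3648 x 2736 px -> 12.2 x 9.1 in at 300 ppi
```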

Some people use print size at a given printing resolution as their measure, settling on and becoming familiar with one of these scenarios; and other people use the pixel dimensions or numbers. These are two ways, one indirect and the other direct, to assess the same thing: what I call image resolution.

The first measure is transparently straightforward when it matches up exactly with the outcome you want, and quite awkward when it doesn't; the second measure is fairly easy to apply toward any possible outcome - it is neutral toward that.

Then when the time comes, one can later think about output resolution; what process do I want to put this image through now.

Is the image OK as it stands, or do I have to resample it (a copy of it) to get the result I want, or have I already converted the image to a ppi that happens to be the same as the desired output ppi, which suits the printer's hardware dpi, at the precise printing size I require?

(If the image is the "right" ppi at the "wrong" printing size, then that's no help - it will all get resampled anyway.)

Or, is my output not expressed in terms of ppi at all, but only in terms of needing particular pixel dimensions within a resampled copy - say, for the web?

RP
 
The printer takes the number of pixels (say 4000 x 3000) and divides by the dimensions of the paper (remember, you told the printer what size paper you’re using.) So for 8x10 paper it calculates 4000/10 = 400 and then 3000/8 = 375. It discards the larger number and prints the entire image at 375 PPI. This will result in some pixels being discarded. That’s because a 4000 x 3000 image is slightly longer than it is wide, when compared to an 8 x 10 image.
If you do a little more research, I think you will find the statement is not correct. Printers always print at their native resolution or some multiple thereof. They do not print at the resolution you feed into the print software. The resolution of any printer is hardware defined by the properties of the stepper motors that drive the print head. They are called "stepper" for a reason. The print head cannot move an arbitrary distance. Before the printer ever gets fed any data, the print software resamples the image to create the correct resolution. Using your example, if the native resolution of the printer is 300 dots per inch, it can never print at 375 dots per inch. The image will be resampled in software to 300 dots per inch.

The native resolution of printer hardware is not always 300 dots per inch. That is why it doesn't make any sense to do your own resampling to the equivalent of 300 dots per inch when printed. Unless you know what the native print resolution is and you tell the printer software or the person using it not to resample, your image is going to get resampled before printing anyway. Some folks think their photo editing software does a better job of resampling than the printer software, so they do their resampling ahead of time. That only works if you know what the printer resolution is and you make sure the default resample does not occur.
 
If you do a little more research, I think you will find the statement is not correct. Printers always print at their native resolution or some multiple thereof.
Actually, they always physically print color at one resolution, which is 600 PPI for Canon and 720 for Epson...never a multiple thereof. This has been tested and proven. That doesn't change the fact that the resulting image resolution is 375 PPI. Just because the printer resizes the image, doesn't mean the true image resolution has increased. The maximum detail in the printed image is 375 PPI, and I didn't want to get into the nitty gritty of how printers handle images of different resolutions.

Canon printers resize images using the Nearest Neighbor algorithm, which is terrible. That’s why some people would rather resize the image to the native resolution of the printer using more sophisticated algorithms, and then apply print-sharpening to the resulting image. That’s what Qimage does, and anyone who uses Qimage can tell you that the difference between Qimage prints and prints straight out of Photoshop can be dramatic, with the Qimage prints showing greater and sharper detail. The Qimage print doesn’t have any more detail than the original...it simply loses less detail in the printing process (which actually tends to muck up a good bit of detail.) But that’s the difference between sending 600 PPI of detail vs. 375 PPI of detail to the printer.
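For what it's worth, the resize-then-sharpen idea can be approximated in ordinary tools; here is a rough Python/Pillow sketch. This is not Qimage's actual pipeline, and the unsharp-mask settings and file names are assumptions for illustration only.

```python
# Resample to the printer's native input resolution with a good algorithm
# (see the earlier sketch), then apply print sharpening to the resized copy.
from PIL import Image, ImageFilter

resized = Image.open("photo_600ppi.tif")     # hypothetical, already resampled to 600 PPI
sharpened = resized.filter(ImageFilter.UnsharpMask(radius=1, percent=80, threshold=2))
sharpened.save("print_ready.tif", dpi=(600, 600))
```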

.
 
If you do a little more research, I think you will find the statement is not correct. Printers always print at their native resolution or some multiple thereof.
Actually, they always physically print color at one resolution, which is 600 PPI for Canon and 720 for Epson...never a multiple thereof. This has been tested and proven. That doesn't change the fact that the resulting image resolution is 375 PPI. Just because the printer resizes the image, doesn't mean the true image resolution has increased. The maximum detail in the printed image is 375 PPI, and I didn't want to get into the nitty gritty of how printers handle images of different resolutions.
Of course printers can be set to print at a multiple of their native resolution. That is a standard feature on any cheapo printer you could buy. They all have some kind of speed versus quality setting. However, I can see how my use of the word "multiple" could be confusing. By "multiple" I mean the stepper motor that moves the print head 1/720 of an inch can also move the head 2/720 or 1/360 of an inch. That has the effect of reducing output resolution.

I agree that it isn't possible to manufacture true resolution from a resampling algorithm. However, it is not true, as you implied, that the printer will output 375 dots if that is what you provide to the software to start with. You obviously understand the concept of resampling and its effect on quality, which is the reason for my original point. I don't understand why you would not want to say that in the first place instead of saying a printer will output at an arbitrary 375 dots per inch. I think an understanding of resampling is important to answering the original poster's question.
Canon printers resize images using the Nearest Neighbor algorithm, which is terrible. That’s why some people would rather resize the image to the native resolution of the printer using more sophisticated algorithms, and then apply print-sharpening to the resulting image. That’s what Qimage does, and anyone who uses Qimage can tell you that the difference between Qimage prints and prints straight out of Photoshop can be dramatic, with the Qimage prints showing greater and sharper detail. The Qimage print doesn’t have any more detail than the original...it simply loses less detail in the printing process (which actually tends to muck up a good bit of detail.) But that’s the difference between sending 600 PPI of detail vs. 375 PPI of detail to the printer.
I can't comment on the merits of Canon printers or their printer driver software because I have never owned one. Neither can I comment on Qimage, although I have heard it is a good program. What I can say is that on the Epson and Hewlett Packard printers I have owned, which are mostly professional models (my wife is a graphic designer and we use them for proofing in our business), you can't tell any difference in the output whether you resample or not. The printer resampling algorithm works as well as the one in PS CS4. I have also found this to be true of commercial photo printing services. As I said in another post on this thread, people who want to find out for themselves can easily do a controlled test using resampled and original resolution files.

We have been in the graphic design business for 13 years, and I can say that the single biggest problem we encounter with the final quality of a customer's publication is their misguided efforts to resize their pictures before giving them to us. The math involved in doing an appropriate resample is confusing, and the probability of doing much more harm than good is high. I realize that this doesn't apply directly to the original poster's question, but the probability of doing a misguided resample that degrades image quality is still there.

By the way, just to muddy the waters a little more, it isn't really true that your Canon prints color pixels at 600 dpi, no matter what the sales literature says. The software uses a dithering algorithm at that resolution to dispense inks of cyan, magenta and yellow to create a color pattern that is much, much lower in resolution than 600 dpi. That number is used as a misleading advertising ploy.
 
...

You can't change history. I have been working with computer graphic arts since the early 1980s, and resolution has always been expressed in DPI. Read the JPEG and TIFF specs (above) if you don't believe me.
...
Wayne, I do believe you about the historical usage of the terms dpi and ppi, even without checking the references ;)

But the OP’s question is about how and when to resize image files for printing. It has nothing to do with the origin or historical usage of digital-imaging terminology, and nothing whatsoever to do with scanners.

These ScanTips articles are too old to be helpful to beginners. They are mostly about history and issues that were more relevant 15 years ago than they are to the OP’s question. Worse, because terminology has changed, they are confusing to beginners. The author’s use of the term dpi, for one, differs from the way it is used today by camera, printer, and software makers. Now I’ll say the same for the word “video,” which is seldom used anymore to refer to a computer monitor. A recent post from the OP makes my point: in http://forums.dpreview.com/forums/read.asp?forum=1002&message=32682840 the OP says:

So, is this quote telling me again that I need to resize to 300ppi before printing?…Also, this described a lot of video-viewing, which I don't really care about right now. I want to know about printing my photos in the best manner.

The articles do not explain that printer drivers include upscaling algorithms, nor do they address the relative resampling quality of today’s drivers, printing software (Photoshop CS series, for example), and specialized upscaling programs. For someone trying to get a grasp on contemporary terminology, and whether/how to rescale their image files to 300 ppi, these articles should not be on any short list of reading materials.

Regarding the dpi/ppi language, it may seem like a nitpick, but it is a frequent source of confusion. Camera specifications and image files talk of pixels. A pixel of course is a complete picture element of defined tone and color. Printing software UIs also refer to pixels as well as pixels per inch (ppi), but printer specifications are in dots per inch (dpi). In today's language, a printer dot is much smaller than a printed pixel. It takes several dots of ink to generate the colors and tones possible in a single pixel. That’s why printer specs include large numbers like 1440 or 2880 dots per inch. No printer can lay down that many pixels per inch. A printer dot is not the same as a printed pixel.
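A rough back-of-the-envelope way to see that difference in Python, using printer figures that come up later in this thread. This is illustrative only; real drivers dither ink dots rather than mapping them to pixels on a neat grid.

```python
# Approximate number of printer dots available per printed pixel.
def dots_per_pixel(printer_dpi_x, printer_dpi_y, image_ppi):
    return (printer_dpi_x / image_ppi) * (printer_dpi_y / image_ppi)

print(dots_per_pixel(2400, 2400, 600))   # 16.0 dots per pixel
print(dots_per_pixel(9600, 2400, 600))   # 64.0 dots per pixel
```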

--
JerryG

My galleries at:
http://www.pbase.com/jerryg1
 
The problem with the terminology has to do with the difference between an image created from scanning a photo or document and taking a photograph with a digital camera. ...
You make a good observation about scans and photographs.
Maybe that is why some people try to make a distinction between PPI and DPI.
More importantly, because they are different quantities when applied to printers and image files. Ink jet printers lay down several dots to print a single pixel. (That's the only way to get a handful of inks to reproduce all those beautiful photographic colors.)
--
JerryG

My galleries at:
http://www.pbase.com/jerryg1
 
Funny you should mention that. I mentioned that same thing in another post an hour or two ago. I really hate to get into dithering algorithms, postscript printing and halftoning. It is no wonder a lot of people have trouble with this printing thing!
 
Of course that is right; what you are trying to avoid is not having enough res if you cropped, or resizing to stupid sizes for web display ;)
Gotcha. Let me recommend you do a little experiment. Have your print service print out two 8 X 10 photos from the same original file. On one of these files, adjust the resolution to exactly 300 pixels per printed inch. For the other print, just give them the original file with no resampling done. I will bet you that you will not be able to tell any difference in the two prints. The higher resolution one may actually look a little better. I have done this experiment several times and come to the conclusion that resampling (resizing as you call it) is a completely useless step.

My opinion is all you need to do for printing purposes is make sure the H to W ratio is what you want so the technician won't get the opportunity to crop the image the wrong way.
--
***********************************************
Please visit my gallery at http://www.pbase.com/alfisti
Pentax Lens examples at http://www.pbase.com/alfisti/images_by_lens
Updated January '09
 
The problem with the terminology has to do with the difference between an image created from scanning a photo or document and taking a photograph with a digital camera. You scan a printed image of a known size at a particular resolution, which can properly be measured in dots per inch. Photographs are captured without a reference to the size of the subject matter. The same camera at the same focal length can capture a 10 megapixel image of a subject anywhere from less than an inch across to many miles across. That is why the number embedded in most camera image files is completely bogus. It has no meaning or relevance.
It is used to specify the resolution that the image is designed to be printed at. Programs that are used to lay out pages read this field and size images accordingly. I just tested this with InDesign. I loaded an image from a publication that I had worked on into Photoshop. Image Size showed:

Width: 2588 pixels
Height: 1936 pixels
Width 8.627 inches
Height 6.453 inches
Resolution: 300 pixels/inch

I changed Resolution (with "Resample Image" unchecked) to be "600". Now Image Size shows

Width: 2588 pixels
Height: 1936 pixels
Width 4.313 inches
Height 3.227 inches
Resolution: 600 pixels/inch

And another one at 72 DPI/PPI.

Width: 2588 pixels
Height: 1936 pixels
Width 35.944 inches
Height 26.889 inches
Resolution: 72 pixels/inch

I saved them with different file names. I created a new InDesign document and "placed" the images. The 300 DPI/PPI one showed up in InDesign 8.6 inches wide. The 600 DPI/PPI one showed up as being 4.3 inches wide. The 72 DPI/PPI one was waaaaaay big.
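The same operation (changing only the stored resolution tag, never the pixels) can be reproduced outside Photoshop; here is a hedged Python/Pillow sketch with a hypothetical file name, showing the arithmetic behind those three tables.

```python
# Pixels stay fixed; only the resolution tag (and therefore the implied
# print size) changes. Equivalent to "Resample Image" unchecked.
from PIL import Image

img = Image.open("publication_photo.tif")     # hypothetical 2588 x 1936 image
for dpi in (300, 600, 72):
    img.save(f"photo_{dpi}dpi.tif", dpi=(dpi, dpi))
    print(f"{dpi} DPI -> {img.width / dpi:.3f} x {img.height / dpi:.3f} inches")
# 300 DPI -> 8.627 x 6.453 inches
# 600 DPI -> 4.313 x 3.227 inches
# 72 DPI  -> 35.944 x 26.889 inches
```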

You and I know what is going on. But layout artists often do not. Or don't want to be bothered figuring it out. Or are dealing with hundreds of images and can't afford the time. They want an image to come in at the maximum size that it can be printed at (at 300 DPI/PPI, usually.) This subject has been raised several times in the DPReview Pro forum, usually in the form of a "why are my images being rejected?" type question. The answer that has come back from photographers who have a lot of experience with submitting for prepress is that some publishers are inflexible. They require images to have the DPI field set at 300 DPI (or whatever their requirements are.) If this isn't done, then the images are not acceptable. Period. End of story.

If submitting images to prepress isn't part of your workflow, then you can ignore the DPI field. Ditto if you work with clueful publishers that understand the situation and adjust accordingly.

But it does a disservice to other photographers to indicate that it is universally true that the DPI field can be ignored. Because there are real world circumstances that require that the DPI field be set appropriately.

Wayne
 
Wayne,

Ok. You win. What you have said is true. My wife has run a graphic design business for 13 years and I have seen exactly the things you are talking about. However, to introduce the technology of page layout into a thread in which some guy is trying to decide how or whether to "resize" his images is not going to help him at all. I always hesitate to provide information that could be confusing and is not germane to the question at hand. I hope the OP does not read your post and get even more confused by information that doesn't apply.

By the way, if someone has his images rejected for publication because the DPI field is not set to 300, he/she needs to seek out another designer/printer. There are many companies which will gladly take the business away from these arrogant fools. We used to publish a 150 page book yearly for a dance studio that included 500 - 600 photographs from a wild variety of sources. We were more than happy to adjust the images and try to make the best of a 1 megapixel image at a bogus "72 dpi" for the kind of money we were paid.
 
These ScanTips articles are too old to be helpful to beginners. They are mostly about history and issues that were more relevant 15 years ago than they are to the OP’s question. Worse, because terminology has changed, they are confusing to beginners. The author’s use of the term dpi, for one, differs from the way it is used today by camera, printer, and software makers.
The problem with attempting to redefine "DPI" is that there is lots of older written material that uses the term as it was originally defined. Attempting to change the definition of "DPI" causes confusion, because this older written material exists and will be read, as we have seen in this thread.

If you are worried that people will be confused about inkjet printer DPI ratings, it would be much better, IMO, to first explain how inkjet printers work and how the inkjet printer DPI ratings correlate to the 24 bit pixels that we see on monitors. Explain that different technologies use "dots" differently, and that "DPI" has different meanings, depending on the technology that it is applied to. (DPI is the same as PPI when applied to monitor pixels, and the pixels that scanners return. But DPI refers to a diffusion pattern when applied to ink jet printers and has no direct correlation with image pixels. Ditto for halftone dots.)

Look, you understand how it works. Wouldn't it be better to explain the concepts so that the person understands the concepts the same way that you and I do? It seems to me that a strategy that depends on people not reading valid material that is older than X years old is doomed to failure. ("Oh, and don't read the specs for JPEG or TIFF either.")

Wayne
 
Ok. You win. What you have said is true. My wife has run a graphic design business for 13 years and I have seen exactly the things you are talking about. However, to introduce the technology of page layout into a thread in which some guy is trying to decide how or whether to "resize" his images is not going to help him at all. I always hesitate to provide information that could be confusing and is not germane to the question at hand. I hope the OP does not read your post and get even more confused by information that doesn't apply.
I don't think it helps to distort information. If you have prepared images for use with InDesign, then you know that no particular good comes from setting the DPI field to, say, 5 PPI. So it isn't helpful to tell somebody that the DPI field is "bogus." Because sometimes the field is used.

Wouldn't it be better to explain that some of the time the DPI field isn't used, but other times it is used? And explain the circumstances for each. i.e., if they will be printing on their own printer, then they probably don't need to set the DPI field (unless they will be using a program like InDesign.) But if they are preparing images to submit for publication, then it probably is a good idea to set the field appropriately.

I hope the OP looked at the height/width/resolution tables in my post and saw the relationships between the various parameters. It isn't that difficult to understand when you see everything. It is sort of an "Oh, that's all it is" thing.
By the way, if someone has his images rejected for publication because the DPI field is not set to 300, he/she needs to seek out another designer/printer.
I believe that the threads I was referring to involved photographers attempting to sell their images to publishers. Magazines, and such. If the images are rejected, then the photographers didn't get paid.

Wayne
 
These ScanTips articles are too old to be helpful to beginners. They are mostly about history and issues that were more relevant 15 years ago than they are to the OP’s question. Worse, because terminology has changed, they are confusing to beginners. The author’s use of the term dpi, for one, differs from the way it is used today by camera, printer, and software makers.
The problem with attempting to redefine "DPI" is that there is lots of older written material that use the term as it was originally defined.
For the record, I am not attempting to redefine anything. I'm just pointing out how the terms dpi and ppi are used today in relation to printers and image files.
... It seems to me that a strategy that depends on people not reading valid material that is older than X years old is doomed to failure....
Why? Would you suggest reading a 1980 auto repair manual to fix your 2009 car? I question the validity of beginner material that uses outdated terminology. It is one thing to be conceptually correct, but quite another to communicate effectively to the target audience.

It seems to me that a strategy that depends on people not reading current valid material is doomed to failure.

Differences aside, you've got to laugh with me at how a plea for help from a beginner turns into a pixxing contest for old men ;)

Have any of us even given the OP a clear direct answer? I think he may have given up all hope of ever getting one!

--
JerryG

My galleries at:
http://www.pbase.com/jerryg1
 
I just reread the original post, and my interpretation is that the guy wants to know nothing more than how he should "resize" his image for printing. My advice was and is to do nothing to it other than crop it to the correct ratio for the size of print he wants. If he wanted our company to create a brochure or something of that nature using his photo, I would have been even more emphatic that he do nothing to the image and just provide the original file. The overwhelming majority of designers and printers we work with would say the same thing. Almost all attempts by non-professionals to prepare images for publication do more harm than good. I don't know what is required for submitting an image for sale, but I am pretty certain that was not the OP's question.

We could go on and on about the process of resizing an image and placing it in a page layout program but that would only confuse the issue and give the impression we are trying to impress somebody instead of help them.
 
Differences aside, you've got to laugh with me at how a plea for help from a beginner turns into a pixxing contest for old men ;)
True. We are DPReviewers to the core.
Have any of us even given the OP a clear direct answer? I think he may have given up all hope of ever getting one!
I tried to with my "InDesign" response to Cedarhill. I was hoping that the example, with the tables showing the relationships between number of pixels/height and width/DPI(PPI) setting, would help explain what is going on.

The subject is moderately complex, such that it is difficult to give an answer in a forum post that is both complete and accurate. Here are some attempts by others:

Image Resolution for Printing
http://www.printingforless.com/images.html

Image Resolution And Print Quality
How The Resolution Of Your Digital Images Affects Image Quality When Printing
http://www.photoshopessentials.com/essentials/image-quality/

Printing quality photos

http://www.microsoft.com/windowsxp/using/digitalphotography/learnmore/bestquality.mspx

In short, your image should have enough pixels that it will contain at least 240 pixels per inch, for the best print quality. Beyond 300 pixels per inch, you reach a point of diminishing returns (and printers probably can't use the extra information.) Below 240 PPI, image quality starts to deteriorate, but might be acceptable. (If the image is printed large enough that you will stand back to view it, then your image can have fewer pixels per inch.)

The formula is simple: "number of pixels on a side" / "size in inches" of the final print. If an image has 3,000 pixels on a side, then that side can be printed ten inches long (or wide) at 300 PPI. If it contains 2,400 pixels and you still want that side to be 10 inches long, then the resolution will be 240 PPI. 720 pixels == 72 PPI. etc. And vice versa. More pixels per inch == more resolution. More resolution == more visible detail. Less resolution == less visible detail.
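That formula, written out as a short Python sketch with the values from the examples above:

```python
# PPI = pixels on a side / printed length in inches, and the reverse.
def ppi(pixels, inches):
    return pixels / inches

def pixels_needed(inches, target_ppi):
    return round(inches * target_ppi)

print(ppi(3000, 10))          # 300.0 PPI
print(ppi(2400, 10))          # 240.0 PPI
print(ppi(720, 10))           # 72.0 PPI
print(pixels_needed(10, 300)) # 3000 pixels for a 10-inch side at 300 PPI
```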

And, image resolution traditionally was expressed in DPI (dots per inch). At the present time, many people prefer to express image resolution as PPI (pixels per inch) and reserve "DPI" for measuring the fineness of the diffusion patterns that ink jet printers use to make the image. Printer DPI is not directly comparable to image DPI (or PPI.)

Older documentation (and some newer documentation) expresses image resolution as DPI. It means the same thing as PPI.

Wayne
 
Of course printers can be set to print at a multiple of their native resolution. That is a standard feature on any cheapo printer you could buy. They all have some kind of speed versus quality setting. However, I can see how my use of the word "multiple" could be confusing. By "multiple" I mean the stepper motor that moves the print head 1/720 of an inch can also move the head 2/720 or 1/360 of an inch. That has the effect of reducing output resolution.
Maybe the motors are physically capable of moving 1/360th of an inch...but the print head is incapable of laying down a single huge dot. That's why the printer always operates at one resolution. When a printer can create dots of different sizes it's always smaller dots for those high-quality settings where, for example, the Canon will print at 9600 x 2400 DPI. That's why, in my original post, I noted that some printers will use up to 64 dots to recreate a single pixel.
I agree that it isn't possible to manufacture true resolution from a resampling algorithm. However, it is not true, as you implied, that the printer will output 375 dots if that is what you provide to the software to start with. You obviously understand the concept of resampling and its effect on quality, which is the reason for my original point. I don't understand why you would not want to say that in the first place instead of saying a printer will output at an arbitray 375 dots per inch. I think an understanding of resampling is important to answering the original poster's question.
I guess this is simply one of those cases where a simplified explanation was provided and then there's disagreement on whether the explanation was too simple. If the person doesn't realize that the DPI setting doesn't mean anything, then he's really at the beginning of the road to understanding resolution and print quality. At this point I'm just trying to get the OP to understand how the...for lack of an official term, logical PPI is determined. If a person understands that, and then learns about the necessary PPI levels for different size prints, then he has all he needs to know to prevent mistakes like trying to have a tiny JPG printed on poster paper. Also, since many people use commercial printers, the whole actual-printer-resolution thing is really one step removed...especially when you have no idea what printer is being used. A Durst Theta 76 will print at 254 DPI, whereas a Noritsu will print at 400 DPI. So you think a discussion of resampling is important, I don't think it is, and only the OP can say for sure which of us is right.
By the way, just to muddy the waters a little more, it isn't really true that your Canon prints color pixels at 600 dpi, no matter what the sales literature says. The software uses a dithering algorithm at that resolution to dispense inks of cyan, magenta and yellow to create a color pattern that is much, much lower in resolution than 600 dpi. That number is used as misleading advertising ploy.
My Canon prints at 2400x2400 actual DPI (and up to 9600x2400 with the right paper) in order to reproduce 600 actual image pixels per inch. So it uses a minimum of 16 dots and a maximum of 64 dots to recreate a single image pixel.

There are several threads in the printer area where we have dissected this to death. My printer will lay down six bands of color for every hundredth of an inch. The paper used is very important. Only the high-quality papers can maintain such thin lines. Also, only the high-end printers can maintain straight lines. You learn a lot about what your printer can and can’t do when you try to print 6 different color lines, each one pixel wide and side by side, at 600 PPI.

.
 
