I am playing with CSS to try to get the hang of setting my font sizes based on a device's DPI. I did not set applicationDPI at the application level, assuming that, when unset, it would fall into the appropriate DPI bucket on its own. I then trace applicationDPI and Capabilities.screenDPI. When I debug on my iPad, it traces Capabilities.screenDPI as 264, which according to the link below should map the device to a DPI bucket of 240. But it traces applicationDPI as 320, and the CSS rule it selects is the 320 rule. How do I get the application to set applicationDPI to 240 when Capabilities.screenDPI is 264, and force it to use the 240 CSS rule?
Is it a retina iPad? There is a list of exceptions in this link
, which says:
- All retina iPads receive 320 DPI
If you look at the code in , you can see that retina iPads are simply
assigned a DPI of 320 regardless of what Capabilities.screenDPI reports.
The good news is that this is configurable: you can supply a
runtimeDPIProvider class to Application and implement your own logic to
calculate the DPI to suit your application's needs. Examples can be seen
here:  and 
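As a rough illustration of that approach, here is a minimal sketch of a custom provider that overrides the runtimeDPI getter to bucket Capabilities.screenDPI by plain thresholds, so a reported 264 lands in the 240 bucket. The class name ScreenDPIBucketProvider and the threshold values are my own choices, not from the Flex SDK; adjust the cutoffs to your needs.

```actionscript
package
{
    import flash.system.Capabilities;
    import mx.core.DPIClassification;
    import mx.core.RuntimeDPIProvider;

    // Hypothetical provider: buckets the raw screen DPI by simple
    // thresholds instead of using the SDK's per-device exceptions.
    public class ScreenDPIBucketProvider extends RuntimeDPIProvider
    {
        override public function get runtimeDPI():Number
        {
            var dpi:Number = Capabilities.screenDPI;

            if (dpi < 200)
                return DPIClassification.DPI_160;
            if (dpi < 280)
                return DPIClassification.DPI_240; // 264 falls in this bucket
            return DPIClassification.DPI_320;
        }
    }
}
```

You would then point the application at it, e.g. `<s:Application runtimeDPIProvider="ScreenDPIBucketProvider" ...>`, and the 240 CSS rule should be selected on that device.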
> In other words, based on a Capabilities.screenDPI of 264, why isn't the
> applicationDPI or runtimeDPI being mapped to 240 as one would expect,
> instead of selecting the incorrect CSS rules?
> Sent from the Apache Flex Users mailing list archive at Nabble.com.