Children’s Book Collection for Apple’s iPhone

November 19, 2008 / College Park, MD

For immediate release

COLLEGE PARK, MD – November 19, 2008 – The International Children’s Digital Library (ICDL) (http://www.childrenslibrary.org), the world’s largest collection of children’s literature freely available on the Internet, today announced the release of the ICDL for iPhone application.

Available free at Apple’s iPhone App Store, the ICDL for iPhone application takes advantage of the advanced interface of the iPhone and iPod Touch to let users read a selection of books from the ICDL’s master collection, which today includes thousands of children’s stories from 60 countries. The books can be read in their original languages and in English.

The ICDL for iPhone application features ICDL’s ClearText technology, designed to make story text easy to read within beautifully illustrated children’s picture books, even on a small mobile screen. The application will be updated regularly as new books become available.

Additional features of the ICDL for iPhone application include:

  • Offline reading — books downloaded from the International Children’s Digital Library can be read both online and offline
  • Online reading — links to the full ICDL collection of more than 3,000 titles in 48 languages from 60 countries
  • Simple navigation — browse books quickly with engaging animations
  • One- or two-page view — takes advantage of the iPhone’s “auto-rotation” feature
  • ClearText — renders exceptionally clear text within highly illustrated pages

The ICDL for iPhone application syncs over Wi-Fi or the user’s cellular network, downloading the latest featured children’s books directly to the device so that children and parents can access content offline and in airplane mode. Compatible with any iPhone or iPod Touch running operating system version 2.0, the application was designed by the International Children’s Digital Library Foundation with support from Zumobi and the University of Maryland’s Human-Computer Interaction Lab.