ARToolKit Professional for iOS release notes

From ARToolworks support library


Latest release: 18.0.

README

Supplementary read me for ARToolKit Professional for iOS, release 18.
=====================================================================
Contents.
---------
About this archive.
Requirements.
Installing.
Getting started.
Training new markers.
Release notes.
Next steps.
Notes on the ARApp design.
About the camera.
About this archive.
------------------- 
This archive contains the ARToolKit Professional libraries, utilities and
examples for iOS, release 18.
ARToolKit Professional for iOS is released to you under a proprietary license.
Please note that your license terms impose various restrictions on distribution
of ARToolKit in both source and binary forms. Legal remedy will be sought by
ARToolworks, Inc. for any unauthorised distribution. Please do not attempt to
publish your app on Apple's App Store until you have obtained a valid license
from ARToolworks.
This archive was assembled by:
    Philip Lamb
    ARToolworks, Inc.
    http://www.artoolworks.com
    2013-04-09
Requirements.
-------------
Requirements:
 * Xcode tools v4.5 or later and iOS SDK v6.0 or later, running on Mac OS X 10.7
or later.
 * These devices are supported: iPhone: 3GS, 4, 4s, 5. iPad: 2, 3, Mini, 4. iPod
touch: 4th Generation, 5th Generation. The device must be running iOS 4.3 or
later. The iPhone 3GS is not supported for NFT use.
 * Paid membership of Apple's iPhone Developer Program.
 * A valid iOS Developer Certificate issued by Apple.
 * A printer to print the pattern PDF images "Hiro pattern", "Kanji pattern",
"Sample1 pattern" and "Sample2 pattern".
Installing.
-----------
ARToolKit Professional for iOS differs from ARToolKit for other platforms, in
that most of the libraries and utilities are supplied in binary-only SDK form,
with additional source for the examples. If you wish to experiment further with
ARToolKit techniques (beyond those provided in the examples) you might wish to
use this iOS release alongside a release of ARToolKit Professional v4.x for Mac
OS X, Windows, or Linux.
The SDK is supplied as an archive file (.tar.gz or .zip file) which need only be
unpacked to a location of your choice, e.g. ~/SDKs/. Drop the archive into your
chosen location and double-click it in the Finder to unpack it.
Getting started.
----------------
The ARToolKit4.xcodeproj file contains six examples of using ARToolKit in an iOS
application.
 * ARApp shows only the most basic usage, which is to track the Hiro marker and
draw a color cube on the marker.
 * ARApp2 shows additional techniques of loading Wavefront .obj models,
including textures, from disk, and tracking and drawing on more than one marker.
It also includes the required code to perform basic finger-tap interaction with
the virtual environment, although this is not currently used in the application.
 * ARAppOSG extends the ARApp2 example with loading of models using
OpenSceneGraph (.osg and .ive formats) including animation, effects, lighting
and more.
 * ARAppMovie extends the ARApp2 example with loading and playback of a movie
file in the 3D virtual environment.
 * ARAppNFT shows basic NFT usage, which is to track the "pinball.jpg" image and
draw a model (using the Eden Library WaveFront .obj renderer) on the image.
 * ARAppNFTOSG shows basic NFT usage, which is to track the "pinball.jpg" image
and draw OpenSceneGraph-based models (using the OSG libraries) on the image.
See the section "About the ARApp design" for more information on the design of
these apps.
Note that the applications will NOT BUILD AND RUN as supplied; you need to
follow the standard procedure for setting up code-signing for an application, as
documented by Apple at
http://developer.apple.com/library/ios/#documentation/Xcode/Conceptual/
iphone_development/128-Managing_Devices_and_Digital_Identities/
devices_and_identities.html. This includes these steps:
1) Obtain an iPhone Developer certificate from Apple.
2) Choose a Bundle ID for the application, e.g. "com.mycompany.myapp" (you can
also use a wildcard ID that matches any application).
3) Using Apple's iPhone Developer portal, generate a provisioning profile
including that Bundle ID, your iPhone Developer certificate, and the UDIDs of the
devices you wish to test on.
4) Set your Bundle ID for the ARApp, ARApp2, ARAppOSG and ARAppMovie targets
in the Xcode project, i.e. edit the files ARApp-Info.plist, ARApp2-Info.plist,
ARAppOSG-Info.plist and ARAppMovie-Info.plist, replacing
"com.artoolworks.${PRODUCT_NAME:identifier}" with your Bundle ID.
5) Build the application with code-signing enabled.
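For example, step 4 amounts to a change like the following in each target's
Info.plist (shown here as a fragment; "com.mycompany.myapp" is a placeholder
for your own Bundle ID):

```xml
<!-- Before (as shipped): -->
<key>CFBundleIdentifier</key>
<string>com.artoolworks.${PRODUCT_NAME:identifier}</string>

<!-- After (your own Bundle ID; "com.mycompany.myapp" is a placeholder): -->
<key>CFBundleIdentifier</key>
<string>com.mycompany.myapp</string>
```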
ARToolKit cannot be used on the iOS simulator. If you've never done any iOS
development before, you will benefit by first getting some experience with
Apple's SDK and examples so that you're familiar with building and running an
iOS app on an actual device. ARToolworks does not typically provide assistance
with these steps.
Training new markers.
---------------------
The utilities required to train new square and NFT markers are provided in the
"bin" directory of the SDK. The utilities are command-line Mac OS X executables
which should be run from a Terminal environment.
Consult the ARToolworks support library for more information:
    http://www.artoolworks.com/support/library/
Creating_and_training_new_ARToolKit_markers
    http://www.artoolworks.com/support/library/ARToolKit_NFT
Usage:
    ./mk_patt
    ./genTexData somejpegfile.jpg
    ./dispImageSet somejpegfile.jpg
    ./dispFeatureSet somejpegfile.jpg
    ./checkResolution Data/camera_para.dat (change camera_para.dat to the
correct camera calibration file for your device, camera and resolution).
Release 18 release notes.
-------------------------
This release includes iOS-specific enhancements and bug fixes, as well as
updating ARToolKit to version 4.6.9. See ChangeLog.txt for details.
Release 17 release notes.
-------------------------
This release includes iOS-specific enhancements and bug fixes, as well as
updating ARToolKit to version 4.6.8. See ChangeLog.txt for details.
Release 16 release notes.
-------------------------
This release includes iOS-specific enhancements and bug fixes, as well as
updating ARToolKit to version 4.6.6. See ChangeLog.txt for details.
Release 15 release notes.
-------------------------
This release includes iOS-specific enhancements and bug fixes, as well as
updating ARToolKit to version 4.6.4. See ChangeLog.txt for details.
Release 14.0 release notes.
---------------------------
This release includes iOS-specific enhancements and bug fixes, as well as
updating ARToolKit to version 4.6.3. See ChangeLog.txt for details.
Release 13.0 release notes.
---------------------------
This release includes iOS-specific enhancements and bug fixes, as well as
updating ARToolKit to version 4.6.2. See ChangeLog.txt for details.
Release 12.0 release notes.
---------------------------
The key change in ARToolKit for iOS release 12 is the shift caused by Apple's
introduction of iOS 6.0 and the iPhone 5/iPod touch 5th Generation. Xcode 4.5 is
required to build apps targeting iOS 6.0. At the same time, any app that targets
iOS 6 may no longer target any release older than iOS 4.3. One of the underlying
reasons for this change is that support has been dropped for the older ARMv6 CPU
instruction set architecture (ISA), and with it the iPhone 3G. The 3GS and all
later devices are capable of executing the ARMv7 ISA. The iPhone 5 also
introduces a new ARMv7s ISA.
ARToolKit has been changed to match these requirements from Apple. All dependent
libraries included in binary form are now built as ARMv7/ARMv7s fat binaries.
Project settings have been changed to reflect the new minimum OS of iOS 4.3 and
minimum Xcode version 4.5, and some support files have been added to support the
new dimensions of the iPhone 5 and iPod touch 5th Generation.
See the upgrade guide at
http://www.artoolworks.com/support/library/
Updating_an_AR_application_with_the_latest_ARToolKit_for_iOS_example_code for
more information.
This release also incorporates miscellaneous other changes. See the
ChangeLog.txt for details.
Release 11.0 release notes.
---------------------------
This release combines ARToolKit for iOS and ARToolKit NFT for iOS in a single
package.
Other changes:
- For the NFT examples, non-essential components have been removed from the
included OpenCV static libraries. This should result in smaller final app
binaries for NFT apps. Please change your linker invocation to link against
libopencv-core.a and libopencv-flann.a instead of libOpenCV.a.
- Although the iOS camera calibration data is now available internally to
libARvideo, a copy of this data as distinct camera parameter files has been
placed into the bin/Data directory. These files may be useful when training new
markers.
Release 10.0 release notes.
---------------------------
This release introduces a change in the way ARToolworks-supplied camera
calibration data is handled on iOS. The multitude of calibration files
previously distributed in the "Data2" folder has been removed, and the calibration
data is now embedded directly in libARvideo on iOS. The appropriate camera
parameter is identified by libARvideo and can be retrieved directly by a new
function (either arVideoGetCParam or ar2VideoGetCParam). Alternately, if no
pre-supplied data is available (e.g. when the version of libARvideo predates the
date of release of a new device) this function returns NULL and the user should
supply a default camera parameter file. The changes can be seen in the
ARViewController class's -start method.
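The fallback logic described above can be sketched as follows. This is a
conceptual sketch in Python, not ARToolKit's actual API: the real calls are
arVideoGetCParam/ar2VideoGetCParam in C, and the loading happens in the
ARViewController class's -start method.

```python
def get_camera_params(get_builtin_cparam, load_cparam_file,
                      default_file="camera_para.dat"):
    """Prefer device-specific calibration data embedded in libARvideo;
    fall back to a default parameter file for unknown devices."""
    cparam = get_builtin_cparam()           # analogue of arVideoGetCParam()
    if cparam is not None:
        return cparam                       # known device: use embedded data
    return load_cparam_file(default_file)   # unknown device: default file

# A device whose release postdates this libARvideo build has no embedded data:
params = get_camera_params(lambda: None, lambda name: "loaded " + name)
```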
NFT Release 6.0 (2012-05-08).
-----------------------------
- This release adds support for loading multiple NFT surfaces. One NFT surface
can be active at any one time from a set of up to 10 loaded surfaces. It also
incorporates changes in ARToolKit for iOS release 9.0. 
Release 9.0 release notes (2012-05-08).
---------------------------------------
This release introduces some code changes which require compilation with the
LLVM compiler. I.e. development is now only supported on Xcode 4.2 and later
using the LLVM compiler.
The VirtualEnvironment class and the various VEObject classes have been modified
so that it is no longer necessary to manually edit VirtualEnvironment.m to match
the VEObject types in use - just drop the required VEObject subclasses into your
project, and they will automatically register with VirtualEnvironment.
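The drop-in registration idea can be sketched outside Objective-C. The class
names below mirror the classes above, but the mechanism shown (a Python
subclass hook) is only an illustration of the pattern, not the actual
Objective-C implementation:

```python
class VEObject:
    registry = []                       # known object types, filled automatically

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        VEObject.registry.append(cls)   # each subclass registers itself when defined

# "Dropping a subclass into the project" is then just defining it:
class VEObjectOBJ(VEObject):
    pass

class VEObjectMovie(VEObject):
    pass
```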
The ARMarker class has been factored into a base class representing all types of
ARToolKit markers, and a subclass VEObjectSquare for normal square ARToolKit
markers (pictorial or barcode). This is to allow other types of markers to be
handled within the same app design, e.g. in ARToolKit NFT for iOS, a new class
ARMarkerNFT is now also available.
NFT release 5.0 (2012-04-27).
-----------------------------
- This release updates the included ARToolKit for iOS to version 4.5.9,
corresponding to iOS release 8.0.
Release 8.0 release notes (2012-04-17).
---------------------------------------
A bug in tear-down of the camera connection introduced in release 7.0 has been
corrected in this release. You should substitute release 8.0 for release 7.0 in
your own projects. See the ChangeLog for other details.
NFT release 4.0 (2012-03-28).
-----------------------------
- This release updates the included ARToolKit for iOS to version 4.5.8,
including support for iPad (March 2012) and a bug fix for OSG-builds under Xcode
4.3.
Release 7.0 release notes (2012-03-28).
---------------------------------------
This release incorporates some non-backwards compatible changes to iOS video
handling. If you have been using a previous release based on the
ARViewController class, you'll need to make the following changes:
In your app's ARViewController.m, find the line:
    [cameraVideo setTookPictureDelegateUserData:[cameraVideo bufDataPtr]];
and change it to:
    [cameraVideo setTookPictureDelegateUserData:NULL];
In your app's ARViewController.m, find the lines:
    - (void) cameraVideoTookPicture:(id)sender userData:(void *)data
    {
        [self processFrame:data];
    }
and change them to:
    - (void) cameraVideoTookPicture:(id)sender userData:(void *)data
    {
        AR2VideoBufferT *buffer = ar2VideoGetImage(gVid);
        if (buffer) [self processFrame:buffer->buff];
    }
Camera handling has been improved, but users will notice now that the "Iris"
image appears for longer when starting the ARViewController. This is because the
underlying call to arVideoOpen now waits until the camera has actually started
supplying video frames. The delay until the first frame is actually processed is
correspondingly shortened.
Note that when a new iOS device is released, and users run your app on it,
ARToolKit cannot recommend a set of calibration parameters for the device's
camera, and will fall back to looking for the default calibration filename
"camera_para.dat", so make sure you supply this file. It has been added into the
Data2 folder of all ARToolKit iOS examples.
Release 6.0 release notes (2011-11-17).
---------------------------------------
This release addresses a single issue with release 5.0 in which the pre-built
libraries were missing the armv6 architecture.
Release 5.0 release notes (2011-10-18).
---------------------------------------
This release is primarily to add iPhone 4S support. Additionally, a number of
memory leaks in the iOS-specific code have been corrected. It is recommended
that users run a diff (e.g. using FileCompare) with their current code to see
the changes.
Release 4.0 release notes (2011-08-09).
---------------------------------------
Improvements:
- Added MovieVideo class to libARvideo on iOS. All iOS apps must now link
against AudioToolbox.framework. A new parameter AR_VIDEO_PARAM_IOS_ASYNC can be
queried to find out if frames are delivered asynchronously (CameraVideo) or must
be fetched by polling (MovieVideo).
- Added a new VEObject subclass, VEObjectMovie, which uses the MovieVideo class.
It allows playback of MPEG4 video (with or without audio) in the virtual
environment.
- Added a new example ARAppMovie, demonstrating the most basic VEObjectMovie
functionality.
- The CameraVideo.h and MovieVideo.h headers are now public (look in
include/AR/sys), and this allows cleaner use of CameraVideo methods in
ARViewController (in all iOS examples).
- Added explanatory notes to the ARView and ARMarker headers.
- Changed IPHONEOS_DEPLOYMENT_TARGET to 4.1 (it might still work OK on 4.0, but
Apple's docs are rather vague about some of the API!)
- Modified gsub_es to allow use of arbitrary buffer sizes, and changing of
buffer size on the fly.
- If CameraVideo cannot find a known model match, it now at least provides
generic model info. This should help things work better with future iOS devices.
- The included OSG frameworks have been updated to version 3.1.0.
- The VEObject class and all pre-provided subclasses now support the concept of
a "local transform". This is applied after the object's normal pose transform.
Typically the pose transform is used to set the modelview matrix for the object,
so the local transform allows translation, rotation, scaling or any combination.
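The local transform described above composes with the pose transform as one
extra matrix multiplication. A minimal sketch in Python (plain 4x4 nested
lists, column-vector convention assumed; the numbers are illustrative):

```python
def matmul4(a, b):
    """Multiply two 4x4 matrices stored as nested lists (row-major)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation4(tx, ty, tz):
    """4x4 translation matrix."""
    m = [[float(i == j) for j in range(4)] for i in range(4)]
    m[0][3], m[1][3], m[2][3] = tx, ty, tz
    return m

# Pose from the tracker, then the object's own local offset:
pose = translation4(0.0, 0.0, -500.0)   # marker 500 units in front of the camera
local = translation4(0.0, 40.0, 0.0)    # raise the model 40 units above the marker
modelview = matmul4(pose, local)        # local transform applied after the pose
```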
Bug fixes:
- Change default threshold mode back to MANUAL. Auto modes have proven
unreliable, and are being improved.
- Fixed a bug whereby the video background would be rendered incorrectly if
other rendering code left multitexturing enabled.
Release 3.0 release notes (2011-06-28).
---------------------------------------
Improvements:
- All targets have been switched to the LLVM compiler, resulting in some
performance improvement.
- A set of shared Xcode 4 build schemes have been added. These are set to build
the "Release" configuration for Xcode's "Run" action. If debugging, change the
active configuration in Xcode's scheme editor to "Debug".
- The included OSG frameworks have been updated to version 2.9.16. Note that the
name libosgUtils.a was corrected to libosgUtil.a, many headers were changed and
some added, and additionally apps using libARosg should add ImageIO.framework to
their target.
- A new video configuration token "-flipv" has been added to allow flipping of
the video stream on the vertical axis of the iOS device (that is, the long axis
of the device). On front-mounted cameras, this option is on by default (but can
be overridden by specifying -noflipv). Flipping the video vertically matches the
orientation of the video image (when displayed on the screen of the device) to
the physical orientation of the device. See
http://www.artoolworks.com/support/library/
Configuring_video_capture_in_ARToolKit_Professional#AR_VIDEO_DEVICE_IPHONE for
more information.
Bug fixes:
- A critical bug affecting identification of iPad 2 models that include 3G
wireless has been corrected. All users of Release 2.0 deploying to iPad 2
devices are urged to update their applications with libARvideo.a included in
this release.
Release 2.0 release notes (2011-04-20, 2011-05-05).
---------------------------------------------------
Improvements:
- Marker-related code is now contained in a new ARMarker class. This and the
ARView class now use NSNotification to communicate with the VirtualEnvironment
class. The former markers.h/markers.c files have been removed, and the format of
the markers.dat file has also changed. This file no longer contains links to
models; rather, the models link back to markers.
- Full support for filtering of marker poses is now included in the ARMarker
class. Filtering helps remove unwanted high-frequency "jittering" or
"oscillating" of marker poses when the marker pose is poorly conditioned
(typically when the marker is at a long distance from and/or perpendicular to the camera
line-of-sight). Filtering can be very easily enabled by adding a line "FILTER 1"
to a marker definition in the markers.dat file.
- VirtualEnvironment objects are now separated out into a new VEObject class and
subclasses, which allows for greater customisation of behaviour.
- The arOSG library (which uses OpenSceneGraph) is now fully supported on iOS.
This will allow for loading of models including transform-based animation,
particle effects, custom lighting, and more. More usage notes on use of the OSG
renderer on iOS are available at
http://www.artoolworks.com/support/library/
ARToolKit_for_iOS_libARosg_Release_Notes.
- A new example named ARAppOSG demonstrates use of .osg and .ive files via the
VEObjectOSG class.
- The ARApp2 example has been refactored to use the new ARMarker and VEObjectOBJ
classes. For users who are satisfied with the rendering facilities provided by
ARApp2, its ongoing use is recommended, as it produces smaller binaries and
places less memory pressure on the app than ARAppOSG.
- Full support for the iPad 2 has been added to all examples, and the examples
now build as single universal binaries which will load on iPhone, iPod touch,
or iPad.
- Users can now choose between rear (main) and front cameras, on devices which
have more than one camera, and additionally, can request different resolution
image data from the cameras. Configuration of these options is performed by use
of named parameters to the arVideoOpen() function (in the -start method of the
ARViewController class). See
http://www.artoolworks.com/support/library/
Configuring_video_capture_in_ARToolKit_Professional#AR_VIDEO_DEVICE_IPHONE for
allowable options.
- Camera calibration files are now supplied for specific iOS device models,
including specific calibrations for different focus-distances of the iPhone 4
rear (main) camera, and front cameras on the iPhone 4, iPod touch 4G, and iPad
2. Loading of these camera parameters has also been simplified by addition of
support to libARvideo for requesting the name of a recommended camera parameter
file. If using this facility in your own projects, be sure to include ALL the
camera_para*.dat files included with the example projects, and don't rename
them. Look at the -start method of the ARViewController class for example usage.
Bug fixes:
- A bug whereby no OpenGL content was rendered (a black screen was seen) on
second and subsequent calls to ARViewController's -start method has been fixed.
The fix is a single line (glStateCacheBeginAgain()) in ARView's -init method.
See
http://www.artoolworks.com/community/forum/viewtopic.php?f=22&t=1110&start=15#
p2328 for more info.
- Handling of tearing-down of the ARViewController when its view is removed
from the active window has been fixed. See
http://www.artoolworks.com/community/forum/viewtopic.php?f=22&t=1110#p1909 for
more info.
- A bug in rendering transparency in Wavefront .obj files has been fixed.
- A bug in handling of badly-formed Wavefront .mtl files has been fixed.
Other notes:
- Unfortunately, Apple's App Store policies are such that iPhone OS v3.1 support
is no longer able to be offered for projects requiring distribution via the App
Store.
Release 1.0 release notes (2010-09-06).
---------------------------------------
Due to the vagaries of Apple's App Store review process, we have temporarily
withdrawn the support for iPhone OS 3.1 which had been provided during the beta
releases. We hope to be able to restore this support in future releases, subject
to some assurances from Apple regarding their App Store review processes. This
change has however had a positive side-effect, which is to simplify the code
paths in the example code. If you already have an App accepted to the App Store
using beta 2.1, and wish to continue to provide 3.1 support, please contact us
for support.
If you have been developing using the 2.1 beta, it is advisable to compare the
example code between this 1.0 release and the beta release in order to see what
code you should change in your own program. A good way to do this is to use the
three-way compare mode of Apple's FileMerge tool. See the instructions at
http://www.artoolworks.com/support/library/
Updating_an_AR_application_with_the_latest_ARToolKit_for_iOS_example_code.
More notes:
- With more iOS-based devices becoming enabled for AR, we have renamed the toolkit
to ARToolKit for iOS.
- This release updates the base ARToolKit libraries to ARToolKit Professional
v4.4.3. If you have been developing with the beta releases, you should replace
the ARToolKit headers and library files in your own project with the versions
included in this release.
- A special camera calibration file for the iPhone 4 has been included.
- Support for the iPhone 4's high-resolution display has not yet been included.
We expect to provide this in a future release.
- Preliminary experimental support for the iPod touch 4G has been provided;
however, we do not recommend releasing an app targeting the iPod touch 4G
without prior testing, as further camera calibration may be required.
- The auto-thresholding features from ARToolKit v4.4.2 and 4.4.3 are enabled by
default on iOS 4.0. If you wish to switch to manual thresholding, locate the
line "//arSetLabelingThreshMode(gARHandle, AR_LABELING_THRESH_MODE_MANUAL)" in
the -start method of ARViewController and remove the comments. Alternately,
manually change the threshold value using arSetLabelingThresh(), as this forces
manual mode.
Next steps.
-----------
We have made a forum for discussion of AR for iOS development available on our
community website.
http://www.artoolworks.com/community/forum/viewforum.php?f=22
You are invited to join the forum and contribute your questions, answers and
success stories.
ARToolKit consists of a full ecosystem of products for desktop, web, mobile and
in-app plugin augmented reality. Stay up to date with information and releases
from ARToolworks by joining our announcements mailing list.
http://www.artoolworks.com/Announcements_mailing_list.html
Notes on the ARApp design.
--------------------------
ARApp is designed specifically for the iOS environment, and so differs somewhat
from the design that experienced ARToolKit users might expect. It respects
iOS's model-view-controller design pattern. Calls to ARToolKit's C functions
(part of the "model") are made entirely by the ARViewController class.
ARViewController is a subclass of UIViewController, and is designed to be
reusable in your own applications. When instantiated, it opens a video
window, and creates an OpenGL rendering context. When dismissed, it disposes of
the OpenGL context and closes the video window.
As provided, the ARApp example includes a MainWindow NIB file which includes an
instance of the ARViewController class. The application delegate and the
ARViewController are connected in the NIB. You can easily modify this design,
e.g. to load a different view controller when the application is opened. You
could then instantiate ARViewController via a NIB or in code.
The OpenGL drawing code is contained within the ARView class and its superclass
EAGLView. EAGLView is based on Apple sample code and will be familiar to
experienced iOS OpenGL programmers. ARView extends the EAGLView class to provide
important functionality for the AR environment, including compositing the OpenGL
drawing together with the camera image.
The contents of the virtual environment are abstracted into the
VirtualEnvironment class (instantiated by the view controller) and the VEObject
class and its subclasses.
 * VEObject:      Root class representing the behaviour of a single object in
                  the virtual environment. Includes methods for updating the
                  pose (position and orientation) of the object, setting a
                  "local" pose for the object, and testing and setting object
                  visibility and other properties.
 * VEObjectOBJ:   A subclass which represents a drawable Wavefront .OBJ model
                  file. It registers for drawing notifications from the ARView
                  and draws the model (using the Eden glm code).
 * VEObjectOSG:   Similar to VEObjectOBJ, it allows models to be loaded and
                  drawn using the OpenSceneGraph framework. Connects to OSG
                  via the arOSG library. Supported file types are .osg and
                  .ive. Models may include transform-based animation, particle
                  effects, custom lighting, and more. More usage notes on use
                  of the OSG renderer on iOS are available at
                  http://www.artoolworks.com/support/library/ARToolKit_for_iOS_libARosg_Release_Notes.
 * VEObjectMovie: Loads and draws a movie file from the local file system as a
                  video texture, using the MovieVideo class. It allows
                  playback of MPEG4 video (with or without audio) in the
                  virtual environment. Recommended maximum movie size is 512
                  pixels or less in both the vertical and horizontal
                  dimensions.
The virtual environment connects to the OpenGL drawing and the ARToolKit
tracking using NSNotifications generated by the ARView and ARMarker classes.
These classes use NSNotifications to tell the virtual environment when to
update object poses with newly-processed marker data, and when to draw the
updated objects.
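The notification-driven flow can be sketched language-neutrally. Below is a
minimal Python analogue of the update/draw notifications; the notification
names are invented for illustration, and on iOS this role is played by
NSNotificationCenter:

```python
class NotificationCenter:
    """Tiny stand-in for NSNotificationCenter: maps names to callbacks."""
    def __init__(self):
        self.observers = {}

    def add_observer(self, name, callback):
        self.observers.setdefault(name, []).append(callback)

    def post(self, name, payload=None):
        for callback in self.observers.get(name, []):
            callback(payload)

center = NotificationCenter()
events = []

# The virtual environment observes marker updates and draw requests:
center.add_observer("MarkerUpdated", lambda pose: events.append(("update", pose)))
center.add_observer("DrawView", lambda _: events.append(("draw", None)))

# ARMarker posts after each processed frame; ARView posts when it renders:
center.post("MarkerUpdated", "new pose")
center.post("DrawView")
```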
The ARMarker class includes a class method to read marker definitions from a
file (markers.dat) and to instantiate ARMarker instances. Full support for
filtering of marker poses is included in the ARMarker class. Filtering helps
remove unwanted high-frequency "jittering" or "oscillating" of marker poses when
the marker pose is poorly conditioned (typically when the marker is a long
distance and/or perpendicular to the camera line-of-sight). Filtering can be
very easily enabled by adding a line "FILTER 1" to a marker definition in the
markers.dat file.
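A marker definition with filtering enabled might look like the fragment below.
Only the FILTER line itself is documented above; the surrounding fields are
illustrative, so check the markers.dat shipped with the examples for the exact
format:

```
# Hypothetical markers.dat entry; only the FILTER line is documented here.
hiro
SINGLE
Data/patt.hiro
80.0
FILTER 1
```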
If you have any further questions about the design, or how the app fits
together, please ask on the community forum (rather than by email) so that
others can benefit from the answers.
About the camera.
-----------------
You can choose between rear (main) and front cameras, on devices which have more
than one camera, and additionally, can request different resolution image data
from the cameras. Configuration of these options is performed by use of named
parameters to the arVideoOpen() function (in the -start method of the
ARViewController class).
The video configuration token "-flipv" allows flipping of the video stream on
the vertical axis of the iOS device (that is, the long axis of the device). On
front-mounted cameras, this option is on by default (but can be overridden by
specifying -noflipv). Flipping the video vertically matches the orientation of
the video image (when displayed on the screen of the device) to the physical
orientation of the device.
See
http://www.artoolworks.com/support/library/
Configuring_video_capture_in_ARToolKit_Professional#AR_VIDEO_DEVICE_IPHONE for
allowable options.
Camera calibration information holds the lens model needed to get ARToolKit
tracking working properly. This is pre-supplied for all current supported iOS
device models, including specific calibrations for different focus-distances of
the iPhone 4 and 4S rear (main) camera, and front cameras on the iPhone 4 and
4S, iPod touch 4G, iPad 2, and iPad (March 2012). With release 10 (ARToolKit
version 4.5.11), loading of these camera parameters has also been simplified by
addition of support to libARvideo for directly requesting the camera parameter
structure. See the release 10 release notes and look at the -start method of the
ARViewController class for example usage.
--
EOF