This is a simple CoreML sample project that detects 10 animals using a trained data set:
- Dog
- Cat
- Butterfly
- Horse
- Elephant
- Squirrel
- Cow
- Spider
- Sheep
- Chicken
Xcode, Swift, ARKit, Vision, CoreML
Make sure you have the following tools: a MacBook, Xcode 12+ (download it from the App Store if you don't have it), and a physical device (iPhone 7 or later). Follow the steps below after you've confirmed Xcode is installed on your MacBook:
- Clone the project
- Open the `.xcodeproj` file in Xcode
- Connect your iPhone to your computer (https://www.twilio.com/blog/2018/07/how-to-test-your-ios-application-on-a-real-device.html)
- Click Run
You can now test the app on a physical device.
If you point the app at an object, every 0.5 seconds the ARSCNView captures its current frame and processes a Vision request using a custom VNCoreMLModel called AnimalClassifierModel, an image classifier trained on the 10 animals listed above. If the captured frame contains one of those animals, the app automatically presents a new screen showing the captured image and a caption with the classification confidence as a percentage.
Note: detection accuracy is not currently perfect; you'll find cases where a random object is classified as a butterfly or another animal from the set.