BabelCamera

Find out how to describe the things around you in another language!

An iOS app using Core ML and the Vision framework in iOS 11, as well as the Google Translate API.
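
The README doesn't show the translation call itself; as a rough sketch, the public Google Translate v2 REST endpoint can be queried like this (the translate function, its parameters, and the error handling are illustrative, not the repo's actual code):

  import Foundation

  // Hypothetical sketch, not the repo's actual networking code: translate
  // `text` into `target` (e.g. "fr") via the public Translate v2 endpoint.
  func translate(_ text: String, into target: String, apiKey: String,
                 completion: @escaping (String?) -> Void) {
      var components = URLComponents(string: "https://translation.googleapis.com/language/translate/v2")!
      components.queryItems = [
          URLQueryItem(name: "key", value: apiKey),
          URLQueryItem(name: "q", value: text),
          URLQueryItem(name: "target", value: target)
      ]
      URLSession.shared.dataTask(with: components.url!) { data, _, _ in
          // Response shape: {"data": {"translations": [{"translatedText": "..."}]}}
          guard let data = data,
                let json = try? JSONSerialization.jsonObject(with: data) as? [String: Any],
                let translations = (json["data"] as? [String: Any])?["translations"] as? [[String: Any]],
                let translated = translations.first?["translatedText"] as? String
          else { return completion(nil) }
          completion(translated)
      }.resume()
  }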

Demo

BabelCamera demo gif

Demo video: https://youtu.be/_jxgNvIRpVk

The app also reads the words aloud.
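
The repo's speech code isn't shown here, but reading a word aloud on iOS takes only a few lines of AVFoundation; a minimal sketch (the word and voice code are placeholders):

  import AVFoundation

  // Minimal sketch of speaking a translated word aloud. The voice's BCP-47
  // code must match the target language (e.g. "fr-FR" for French).
  let synthesizer = AVSpeechSynthesizer()
  let utterance = AVSpeechUtterance(string: "bouteille")
  utterance.voice = AVSpeechSynthesisVoice(language: "fr-FR")
  synthesizer.speak(utterance)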

Running

  1. Get a Google Translate API Key
  2. Copy Keys.example.xcconfig to Keys.xcconfig and replace YOUR_API_KEY_HERE in the file with your API key from step 1 (see the sketch after this list)
  3. Clean build folder / clear derived data (βŒ₯β‡§βŒ˜K)
  4. Run the app
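
An xcconfig file is a plain list of build-setting assignments, so Keys.xcconfig should end up looking something like the sketch below (the setting name GOOGLE_TRANSLATE_API_KEY is a guess; keep whatever name Keys.example.xcconfig actually uses):

  // Keys.xcconfig — replace the placeholder with your key from step 1.
  // The setting name here is an assumption; use the one from Keys.example.xcconfig.
  GOOGLE_TRANSLATE_API_KEY = YOUR_API_KEY_HERE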

Vision model

The app currently comes with SqueezeNet. You can replace it with another model available on Apple's Core ML page (e.g. ResNet50, Inception v3, VGG16) by dragging the .mlmodel file (e.g. VGG16.mlmodel) into Xcode where you see SqueezeNet.mlmodel, and then:

In VisionService.swift, replace the line:

model = try? VNCoreMLModel(for: SqueezeNet().model)

with

model = try? VNCoreMLModel(for: VGG16().model)

or the model of your choice.
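
For context, here is roughly how a VNCoreMLModel is used for classification with the Vision framework. This is a generic sketch, not the actual contents of VisionService.swift (the classify function and its callback are illustrative):

  import CoreML
  import Vision

  // Generic Vision classification sketch: classify `cgImage` and hand
  // the top label to a callback (e.g. for translation).
  func classify(_ cgImage: CGImage, completion: @escaping (String) -> Void) {
      guard let model = try? VNCoreMLModel(for: SqueezeNet().model) else { return }
      let request = VNCoreMLRequest(model: model) { request, _ in
          // Classification models return VNClassificationObservation results,
          // already sorted by confidence; take the top label.
          if let best = (request.results as? [VNClassificationObservation])?.first {
              completion(best.identifier)
          }
      }
      let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
      try? handler.perform([request])
  }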

Languages currently supported

These are the languages that can both be translated by Google Translate and pronounced by iOS (listed in Language.swift); a sketch of what that file might look like follows the list below.

  • Arabic
  • Chinese (Simplified)
  • Chinese (Traditional)
  • Czech
  • Danish
  • Dutch
  • English
  • Finnish
  • French
  • German
  • Greek
  • Hebrew
  • Hindi
  • Hungarian
  • Indonesian
  • Italian
  • Japanese
  • Korean
  • Norwegian
  • Polish
  • Portuguese
  • Romanian
  • Russian
  • Slovak
  • Spanish
  • Swedish
  • Thai
  • Turkish
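
Language.swift itself isn't reproduced in this README; a plausible shape for it is an enum that pairs each Google Translate language code with a BCP-47 voice code that iOS can pronounce (the cases and codes below are illustrative, not the file's actual contents):

  import AVFoundation

  // Illustrative sketch of what Language.swift might contain: each case's
  // raw value is a Google Translate language code, and voiceCode is the
  // matching BCP-47 identifier for AVSpeechSynthesisVoice(language:).
  enum Language: String {
      case french = "fr"
      case japanese = "ja"
      case german = "de"
      // ... one case per supported language

      var voiceCode: String {
          switch self {
          case .french:   return "fr-FR"
          case .japanese: return "ja-JP"
          case .german:   return "de-DE"
          }
      }
  }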

License

MIT License

