This sample application demonstrates how to use Azure Cognitive Services, and is a companion to my session "Why Don't You Understand Me? Build Intelligence into Your Apps".
The sample has been built and tested with Visual Studio 2017 Update 2 and above, and requires the "Mobile Development with .NET" (a.k.a. Xamarin) and "Azure" workloads to be installed. Additional requirements depend on the target platform:
- For Android you will need an Android Emulator or an Android device.
- For iOS you will need a connected Mac with an iPhone simulator or connected iPhone.
- For UWP you will need a Windows 10 machine.
In addition, you will need to obtain the following API keys:
- An API key for the 500px API. Registering an application and obtaining an API key can be done via this page.
- An API key for each of the Azure Cognitive Services used by the sample. To obtain a key, log in to the Azure Portal, create a new Cognitive Services resource of the required API type, and copy the API key from the portal. The following services are used:
  - Computer Vision API
  - Face API
  - Emotion API
  - Text Analytics API
  - Translator Text API
  - Bing Autosuggest API
  - Bing Speech APIs
All these resources can be automatically deployed to a resource group in your subscription by using the deployment template link below.
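If you prefer the command line over the portal or the deployment template, resources of this kind can also be created with the Azure CLI. The resource and group names below are placeholders, not part of the sample; only the Computer Vision resource is shown, and the other services follow the same pattern with a different `--kind`:

```shell
# Create a resource group to hold the Cognitive Services resources
# (names and region are placeholders)
az group create --name my-cognitive-rg --location westus

# Create a Computer Vision resource; repeat with the appropriate
# --kind for the other services (e.g. Face, TextAnalytics)
az cognitiveservices account create \
  --name my-vision \
  --resource-group my-cognitive-rg \
  --kind ComputerVision \
  --sku S1 \
  --location westus \
  --yes

# Retrieve the API key to paste into Secrets.config later
az cognitiveservices account keys list \
  --name my-vision \
  --resource-group my-cognitive-rg
```

These commands require an authenticated `az login` session and an active Azure subscription.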
- A training key and prediction key for an Azure Custom Vision service.
Once you have obtained your keys, make a copy of the file src/IntelligentPx/IntelligentPx/Secrets.default.config and rename it to src/IntelligentPx/IntelligentPx/Secrets.config, then enter your API keys there. At this point you should be able to compile and run the sample by opening the solution file src/IntelligentPx.sln.
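The actual key names in Secrets.config are defined by Secrets.default.config in the repository; the fragment below is only an illustrative sketch (the entry names are assumed, not taken from the sample):

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Illustrative only: copy Secrets.default.config and keep its real entry names -->
<secrets>
  <secret name="PxApiKey" value="YOUR-500PX-API-KEY" />
  <secret name="ComputerVisionApiKey" value="YOUR-COMPUTER-VISION-KEY" />
  <secret name="FaceApiKey" value="YOUR-FACE-API-KEY" />
  <!-- ...one entry per remaining service key... -->
</secrets>
```

Keep Secrets.config out of source control, since it contains your private API keys.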