How To Implement Azure Face API Using Visual Studio 2019

In this Azure tutorial, we will discuss how to implement the Azure Face API using Visual Studio 2019. Along with this, we will also cover a few related topics: an Azure Face API example in C#, creating the Microsoft Azure Face API using the Azure Portal, creating a WPF application using Visual Studio 2019 and C# to implement the Azure Face API, detecting faces in an image with the Azure Face API and C#, a Microsoft Face API tutorial, and Azure Face API pricing.

How do you implement the Azure Face API using Visual Studio 2019? Follow the below steps:

  1. Create an Azure Face API on Azure Portal
  2. Create a WPF application using C# in Visual Studio 2019

We will discuss the implementation in detail below.

How To Implement Azure Face API Using Visual Studio 2019

Here we will walk through an implementation of the Face API using C# in Visual Studio 2019. As part of the implementation, we will build the following functionality:

  • Creating the Microsoft Azure Face API using the Azure Portal.
  • Creating a WPF application using Visual Studio 2019 and C# to implement the Face API. The application will detect the faces in an image, draw a red frame around each face, and, when you move the mouse over a detected face, show all of its attributes such as Gender, Age, Emotion, Glasses, etc.

Azure Face API Example C#

Well, it is an interesting topic. Before starting the actual implementation, we should go over the prerequisites for the development.

Prerequisites

Below are the prerequisites for Implementing Azure Face API Using Visual Studio 2019.

  • You must have a valid Azure subscription or Azure account. If you don't have an Azure account yet, create an Azure free account now.
  • You must have Visual Studio 2019 installed on your dev machine. If you don't have it yet, install Visual Studio 2019 on your dev machine now.

Creating Microsoft Azure Face API using the Azure Portal

Follow the below steps to create the Microsoft Azure Face API using the Azure Portal.

Assuming that you have a valid Azure account or Azure subscription, let's start creating the Azure Cognitive Services Face API on the Azure Portal.

Log in to the Azure Portal (https://portal.azure.com/).

Once you have logged in to the Azure Portal, from the left side menu, click on the + Create a Resource button as highlighted below.

Creating The Azure Face API On Azure Portal

For the next steps, follow my article on creating the Azure Face API on the Azure Portal.

Assuming you have created the Azure Cognitive Services Face API on the Azure Portal by following the above article, your Azure Face API is now ready. You can see it below.

How to Create The Face API Azure Portal

Once you have created the Azure Face API, the next step is to copy its key value. To do that, navigate to the Face API page and click on Keys and Endpoint in the left navigation; you will see Key1 and Key2. Copy the value of Key1 and keep it in a notepad.

You can click on the Copy button as highlighted to copy the value of Key1. We will need this key value while creating the WPF application using Visual Studio 2019 to implement the Azure Face API in the next section.

creating Azure Cognitive Services Face API

Our first step is now complete: the Azure Face API is ready, and we have copied its key value to a notepad. Now we will move on to the next step, i.e. creating a WPF application using Visual Studio 2019 and C#.

Creating a WPF application using Visual Studio 2019 and C#

Follow the below steps to create a WPF application using Visual Studio 2019 and C#.

Open Visual Studio 2019 on your dev machine.

Click on the Create a new Project button on the Getting Started window.

Choose the WPF App (.NET Framework) project template and then click on the Next button.

Creating a WPF application using Visual Studio 2019

On the Configure your new project window, provide the below details

  • Project Name: Provide a name for your WPF application.
  • Location: Choose a location where you want to save your WPF application.
  • Framework: Select the latest .NET Framework version. As of this writing, that is .NET Framework 4.7.2.

Finally, click on the Create button to create the new project.

Face recognition in c# windows application

Now you can see that the project was created successfully without any issues.

face recognition in c# WPF application

Detecting The Face In An Image Using Azure Face API And C#

Once the project has been created successfully, the next step is to add two NuGet packages to your project. To add the NuGet packages:

Right-click on the project and then click on the Manage NuGet Packages option as shown below.

How to implement face recognition in c# WPF application

Now click on the Browse tab, search for the Newtonsoft.Json NuGet package, select the package, and then click on the Install button to install it.

c# code for face detection

In the same way, we need to add one more NuGet package, i.e. the Torutek.Microsoft.ProjectOxford.Face package. To install it, search for Torutek.Microsoft.ProjectOxford.Face, select the NuGet package, and then click on the Install button.

Implement Face API Using Visual Studio 2019

Now that we have installed the required NuGet packages, it is time to add the code that implements the main functionality.

Open the MainWindow.xaml and add the below code.

Note: Make sure to change the project name or class name on the first line to match yours.

<Window x:Class="WpfAppFaceAPI.MainWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
        xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
        xmlns:local="clr-namespace:WpfAppFaceAPI"
        mc:Ignorable="d"
        Title="MainWindow" Height="700" Width="960">
    <Grid x:Name="BackPanel">
        <Image x:Name="MyFace" Stretch="Uniform" Margin="0,0,0,55" MouseMove="MousePointer" />
        <DockPanel DockPanel.Dock="Bottom">
            <Button x:Name="BrowseButton" Width="79" Height="25" VerticalAlignment="Bottom" HorizontalAlignment="Left"  
                    Content="Upload Image"  
                    Click="BrowseButton_Click" />
            <StatusBar VerticalAlignment="Bottom">
                <StatusBarItem>
                    <TextBlock Name="statusBar" />
                </StatusBarItem>
            </StatusBar>
        </DockPanel>
    </Grid>
</Window>

The next change is in the MainWindow.xaml.cs file. Add the below code to your MainWindow.xaml.cs file, making sure to change the namespace and class name to match yours.

using Microsoft.ProjectOxford.Common.Contract;
using Microsoft.ProjectOxford.Face;
using Microsoft.ProjectOxford.Face.Contract;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Data;
using System.Windows.Documents;
using System.Windows.Input;
using System.Windows.Media;
using System.Windows.Media.Imaging;
using System.Windows.Navigation;
using System.Windows.Shapes;

namespace WpfAppFaceAPI
{
    /// <summary>
    /// Interaction logic for MainWindow.xaml
    /// </summary>
    public partial class MainWindow : Window
    {
        private readonly IFaceServiceClient faceServiceClient =
            new FaceServiceClient("191d05e8f6xxxxxxxxcb4fc9373", "https://eastus.api.cognitive.microsoft.com/face/v1.0/");

        Face[] facesDetected;   // Faces returned by the Face API for the current image
        String[] faceDesc;      // Pre-built description string for each detected face
        double factresize;      // Converts image pixel coordinates to WPF device-independent units (96 DPI)
        public MainWindow()
        {
            InitializeComponent();
        }

        // Handles the Upload Image button: lets the user pick a JPEG image, displays it,
        // detects the faces via the Face API, and draws a red rectangle around each one.
        private async void BrowseButton_Click(object sender, RoutedEventArgs e)
        {
            var openDialog = new Microsoft.Win32.OpenFileDialog();

            openDialog.Filter = "JPEG Image(*.jpg)|*.jpg";
            bool? rslt = openDialog.ShowDialog(this);

            
            // ShowDialog returns null if the dialog is dismissed, so compare against true
            // instead of casting (casting a null bool? to bool throws).
            if (rslt != true)
            {
                return;
            }

          
            string imagePath = openDialog.FileName;

            Uri imageUri = new Uri(imagePath);
            BitmapImage src = new BitmapImage();

            src.BeginInit();
            src.CacheOption = BitmapCacheOption.None;
            src.UriSource = imageUri;
            src.EndInit();

            MyFace.Source = src;

         
            Title = "Detecting the Faces...";
            facesDetected = await UploadImageFaces(imagePath);
            Title = String.Format("Finished Detecting the Faces. {0} face(s) detected", facesDetected.Length);

            if (facesDetected.Length > 0)
            {
               
                DrawingVisual v = new DrawingVisual();
                DrawingContext dc = v.RenderOpen();
                dc.DrawImage(src,
                    new Rect(0, 0, src.Width, src.Height));
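                // WPF draws in device-independent units at 96 DPI, so compute a factor
                // that converts the image's pixel coordinates for drawing the rectangles.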
                double dpi = src.DpiX;
                factresize = 96 / dpi;
                faceDesc = new String[facesDetected.Length];

                for (int i = 0; i < facesDetected.Length; ++i)
                {
                    Face face = facesDetected[i];

                    // Logic to Draw Rectangle Shape 
                    dc.DrawRectangle(
                        Brushes.Transparent,
                        new Pen(Brushes.Red, 2),
                        new Rect(
                            face.FaceRectangle.Left * factresize,
                            face.FaceRectangle.Top * factresize,
                            face.FaceRectangle.Width * factresize,
                            face.FaceRectangle.Height * factresize
                            )
                    );
  
                    faceDesc[i] = Description(face);
                }

                dc.Close();

              
                RenderTargetBitmap frb = new RenderTargetBitmap(
                    (int)(src.PixelWidth * factresize),
                    (int)(src.PixelHeight * factresize),
                    96,
                    96,
                    PixelFormats.Pbgra32);

                frb.Render(v);
                MyFace.Source = frb;

                
                statusBar.Text = "Place the mouse pointer over any face to see that face's description in detail.";
            }
        }
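        // Shows the stored description of the face under the mouse pointer in the status bar.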
        private void MousePointer(object sender, MouseEventArgs e)
        {
             
            if (facesDetected == null)
                return;

             
            Point ms = e.GetPosition(MyFace);

            ImageSource imageSource = MyFace.Source;
            BitmapSource bitmapSource = (BitmapSource)imageSource;

             
            // Scale between the face rectangle coordinates (original image pixels)
            // and the size of the image as it is displayed on screen.
            var scale = MyFace.ActualWidth / (bitmapSource.PixelWidth / factresize);

            
            bool mouseOnFace = false;

            for (int i = 0; i < facesDetected.Length; ++i)
            {
                FaceRectangle fr = facesDetected[i].FaceRectangle;
                double left = fr.Left * scale;
                double top = fr.Top * scale;
                double width = fr.Width * scale;
                double height = fr.Height * scale;

               
                if (ms.X >= left && ms.X <= left + width && ms.Y >= top && ms.Y <= top + height)
                {
                    statusBar.Text = faceDesc[i];
                    mouseOnFace = true;
                    break;
                }
            }

            
            if (!mouseOnFace)
                statusBar.Text = "Place the mouse pointer over a face to see the face description.";
        }

        

        // Uploads the image file to the Azure Face API and returns the detected
        // faces along with the requested face attributes.
        private async Task<Face[]> UploadImageFaces(string imageFilePath)
        {
            // All the Face attributes
            IEnumerable<FaceAttributeType> attributes =
                new FaceAttributeType[] { FaceAttributeType.Gender, FaceAttributeType.Age, FaceAttributeType.Smile, FaceAttributeType.Emotion, FaceAttributeType.Glasses, FaceAttributeType.Hair };

            
            try
            {
                using (Stream imageFileStream = File.OpenRead(imageFilePath))
                {
                    Face[] myfaces = await faceServiceClient.DetectAsync(imageFileStream, returnFaceId: true, returnFaceLandmarks: false, returnFaceAttributes: attributes);
                    return myfaces;
                }
            }
           
            catch (FaceAPIException f)
            {
                MessageBox.Show(f.ErrorMessage, f.ErrorCode);
                return new Face[0];
            }
             
            catch (Exception e)
            {
                MessageBox.Show(e.Message, "Error");
                return new Face[0];
            }
        }

        

        // Builds a readable description (gender, age, smile, emotions, glasses, hair)
        // for a single detected face; this is shown in the status bar on mouse-over.
        private string Description(Face face)
        {
            StringBuilder sb = new StringBuilder();

            sb.Append("Face: ");

             
            sb.Append(face.FaceAttributes.Gender);
            sb.Append(", ");
            sb.Append(face.FaceAttributes.Age);
            sb.Append(", ");
            sb.Append(String.Format("smile {0:F1}%, ", face.FaceAttributes.Smile * 100));

            // Display all emotions if it is over 10%.  
            sb.Append("Emotion Level: ");
            EmotionScores emotionScores = face.FaceAttributes.Emotion;
            if (emotionScores.Anger >= 0.1f) sb.Append(String.Format("anger level {0:F1}%, ", emotionScores.Anger * 100));
            if (emotionScores.Contempt >= 0.1f) sb.Append(String.Format("contempt {0:F1}%, ", emotionScores.Contempt * 100));
            if (emotionScores.Disgust >= 0.1f) sb.Append(String.Format("disgust {0:F1}%, ", emotionScores.Disgust * 100));
            if (emotionScores.Fear >= 0.1f) sb.Append(String.Format("fear level {0:F1}%, ", emotionScores.Fear * 100));
            if (emotionScores.Happiness >= 0.1f) sb.Append(String.Format("happiness level {0:F1}%, ", emotionScores.Happiness * 100));
            if (emotionScores.Neutral >= 0.1f) sb.Append(String.Format("neutral {0:F1}%, ", emotionScores.Neutral * 100));
            if (emotionScores.Sadness >= 0.1f) sb.Append(String.Format("sadness level {0:F1}%, ", emotionScores.Sadness * 100));
            if (emotionScores.Surprise >= 0.1f) sb.Append(String.Format("surprise {0:F1}%, ", emotionScores.Surprise * 100));

            
            sb.Append(face.FaceAttributes.Glasses);
            sb.Append(", ");

            
            sb.Append("Hair: ");

              
            if (face.FaceAttributes.Hair.Bald >= 0.01f)
                sb.Append(String.Format("bald {0:F1}% ", face.FaceAttributes.Hair.Bald * 100));

            
            HairColor[] hcolors = face.FaceAttributes.Hair.HairColor;
            foreach (HairColor hairColor in hcolors)
            {
                if (hairColor.Confidence >= 0.1f)
                {
                    sb.Append(hairColor.Color.ToString());
                    sb.Append(String.Format(" {0:F1}% ", hairColor.Confidence * 100));
                }
            }

             
            return sb.ToString();
        }
    }
}

Note: Make sure to change the key to match the Azure Cognitive Services Face API that you created above (the Key1 value you copied to a notepad). Also make sure to change the endpoint URL to match your Azure Face API and the region you selected.

private readonly IFaceServiceClient faceServiceClient =
            new FaceServiceClient("Your Azure FaceAPI key value", "Your Azure FaceAPI EndPoint URL");
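Hardcoding the key in the source code is fine for a quick demo, but it is safer to keep it out of the code. Below is a minimal sketch of reading the key and endpoint from App.config instead; the FaceApiKey and FaceApiEndpoint setting names are hypothetical, and you will need a project reference to the System.Configuration assembly.

// In App.config, add inside <configuration>:
//   <appSettings>
//     <add key="FaceApiKey" value="Your Azure FaceAPI key value" />
//     <add key="FaceApiEndpoint" value="Your Azure FaceAPI EndPoint URL" />
//   </appSettings>

// In MainWindow.xaml.cs (add: using System.Configuration;)
private readonly IFaceServiceClient faceServiceClient =
    new FaceServiceClient(
        ConfigurationManager.AppSettings["FaceApiKey"],
        ConfigurationManager.AppSettings["FaceApiEndpoint"]);

This keeps the key out of source control if you exclude the config file, and lets you switch the region endpoint without recompiling.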

Now you can see the code changes here, as below.

How To Implement Face API Using Visual Studio 2019

Now we are done with the code changes, so it is time to run the application and see if the functionality works as expected. Press F5 to run the WPF application.

Once you run the application, you will see the below window. Click on the Upload Image button to browse for an image from your local machine (assuming you have a valid image on your local machine).

How To Implement Face API Using C# Visual Studio 2019

It will now show your local machine's folders; navigate to the path where you have stored the image and then click on the Open button to open the image.

Now you can see that it reads the image successfully, identifies that there are 3 faces in the given image, and draws a red rectangle around each face.

As you can see, next to the Upload Image button it shows the message "Place the mouse pointer over a face to see the face description."

How To Implement Face API Using C# Visual Studio

Now, once you place the mouse pointer over any of the faces, you can see all the attribute details such as Gender, Age, Emotion, Hair, etc. for that specific face next to the Upload Image button, as highlighted below.

detection the face in an image azure face api and c#

Microsoft Face API Tutorial

The Microsoft Face API provides advanced algorithms that help you detect and identify human faces in digital images, including detecting emotions and facial expressions such as happiness, fear, etc.

You can check out an end-to-end Microsoft Face API Tutorial now.
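Under the hood, the Microsoft.ProjectOxford.Face client library calls the Face API REST endpoint. If you are curious what that looks like without the SDK, here is a minimal sketch of the same detect call using HttpClient; it assumes the same key and endpoint you copied from the Azure Portal, and requests the same attributes as the UploadImageFaces method above.

using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class FaceDetectRestSketch
{
    static async Task<string> DetectFacesAsync(string imagePath)
    {
        using (var client = new HttpClient())
        {
            // The Key1 value copied from the Keys and Endpoint page.
            client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "Your Azure FaceAPI key value");

            // The detect route on your Face API endpoint, with the attributes to return.
            string url = "https://eastus.api.cognitive.microsoft.com/face/v1.0/detect" +
                         "?returnFaceAttributes=age,gender,smile,emotion,glasses,hair";

            using (var content = new ByteArrayContent(File.ReadAllBytes(imagePath)))
            {
                content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
                HttpResponseMessage response = await client.PostAsync(url, content);
                // The response body is a JSON array with one object per detected face.
                return await response.Content.ReadAsStringAsync();
            }
        }
    }

    // Example usage (async Main requires C# 7.1 or later); the image path is hypothetical.
    static async Task Main()
    {
        Console.WriteLine(await DetectFacesAsync(@"C:\path\to\image.jpg"));
    }
}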

Azure Face API Pricing

Here we will discuss the Azure Face API pricing details. One more great thing about the Azure Face API is that it follows a pay-only-for-what-you-use model: you pay only for what you have used, with no termination fees and no upfront costs. Let's discuss the pricing structure of the Microsoft Azure Face API.

Instance: Web/Container – Free
  • Throughput: 20 transactions per minute
  • Features that can be utilized: face detection, face verification, identification of a specific face, grouping of faces, searching for similar faces
  • Price details: 30,000 free transactions per month

Instance: Web/Container – Standard
  • Throughput: 10 transactions per second
  • Features that can be utilized: face detection, face verification, identification of a specific face, grouping of faces, searching for similar faces
  • Price details: ₹66.097 per 1,000 transactions for 0-1M transactions; ₹52.877 per 1,000 transactions for 1M-5M transactions; ₹39.658 per 1,000 transactions for 5M-100M transactions; ₹26.439 per 1,000 transactions for 100M+ transactions

Instance: Face Storage
  • Price details: ₹0.661 per 1,000 faces per month

This is all about the Azure Face API pricing structure. If you need more information, you can visit the official Microsoft pricing page.
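To make the tiered Standard rates concrete, here is a rough C# sketch of how a monthly bill could be estimated. It assumes the tiers apply progressively (the first 1M transactions at the first rate, the next 4M at the second rate, and so on); check the official pricing page for the exact billing rules.

using System;

class FacePricingSketch
{
    // Standard-tier rates in ₹ per 1,000 transactions, from the table above.
    static readonly long[] TierUpperLimits = { 1_000_000, 5_000_000, 100_000_000, long.MaxValue };
    static readonly double[] RatePer1000 = { 66.097, 52.877, 39.658, 26.439 };

    static double EstimateMonthlyCost(long transactions)
    {
        double cost = 0;
        long billed = 0;
        for (int i = 0; i < TierUpperLimits.Length && billed < transactions; i++)
        {
            long inThisTier = Math.Min(transactions, TierUpperLimits[i]) - billed;
            cost += inThisTier / 1000.0 * RatePer1000[i];
            billed += inThisTier;
        }
        return cost;
    }

    static void Main()
    {
        // Example: 2,500,000 transactions = 1M at ₹66.097/1,000 + 1.5M at ₹52.877/1,000
        Console.WriteLine("₹" + EstimateMonthlyCost(2_500_000).ToString("F2"));
    }
}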

Wrapping Up

Well, in this article we discussed how to implement the Azure Face API using Visual Studio 2019: an Azure Face API example in C#, creating the Microsoft Azure Face API using the Azure Portal, creating a WPF application using Visual Studio 2019 and C# to implement the Azure Face API, detecting faces in an image with the Azure Face API and C#, a Microsoft Face API tutorial, and the Azure Face API pricing. Hope you have enjoyed this article!