Use other services for machine learning
- Skill 4.1: Build and use neural networks with the Microsoft Cognitive Toolkit
- Skill 4.2: Streamline development by using existing resources
- Skill 4.3: Perform data science at scale by using HDInsight
- Skill 4.4: Perform database analytics by using SQL Server R Services on Azure
- Thought experiment
- Thought experiment answers
- Chapter summary
This thought experiment lets you demonstrate the skills and knowledge you have gained by reviewing the topics covered in this chapter. The answers are included in the next section.
Answer the following questions for your manager:
You need to detect manufacturing errors from photos of factory-produced parts. You have decided to use a deep convolutional neural network because you are going to work with images. Which activation function would you prefer to use in your network: sigmoid, tanh, or ReLU? Can you use sigmoid, tanh, and ReLU activations in Net#?
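As a quick refresher on the three candidates, here is an illustrative sketch in plain Python (this is not Net# code; it only shows what each activation computes):

```python
import math

# Sketch of the three activation functions under consideration.
# In deep convolutional networks, ReLU is usually preferred because it
# does not saturate for positive inputs, which mitigates vanishing gradients.

def sigmoid(x):
    """Squashes x into (0, 1); saturates for large |x|."""
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    """Squashes x into (-1, 1); zero-centered but still saturates."""
    return math.tanh(x)

def relu(x):
    """Rectified linear unit: passes positive values, zeroes out negatives."""
    return max(0.0, x)
```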
You are working for a hospital, and you are asked to perform two tasks. Assuming that you have about 100k examples for both problems, to which of these two problems does deep learning apply best?
- Classify tumors as benign or malignant depending on the size and shape of the tumor and the age and sex of the patient.
- Classify images of skin lesions as benign lesions or malignant skin cancers.
You are collaborating on the development of an application, and your task is to provide a REST service that classifies images. Order the following possible solutions from least to most costly from a development point of view.
- Use the Cognitive Services Vision APIs to classify images.
- Within a Data Science Virtual Machine, train a convolutional neural network using CNTK.
- Define a convolutional network on Azure ML using Net#.
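For context on the first option, calling the Computer Vision API is little more than an authenticated HTTP POST. The sketch below only builds the request; the region, API version, key, and image URL are placeholder assumptions you would replace with your own values:

```python
import json

# Hypothetical values -- substitute your own region and subscription key.
SUBSCRIPTION_KEY = "YOUR_KEY_HERE"
ENDPOINT = "https://westus.api.cognitive.microsoft.com/vision/v1.0/analyze"

# Cognitive Services authenticates with the Ocp-Apim-Subscription-Key header.
headers = {
    "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
    "Content-Type": "application/json",
}
params = {"visualFeatures": "Tags,Description"}
body = json.dumps({"url": "https://example.com/part-photo.jpg"})  # illustrative image URL

# With the `requests` library installed, the actual call would be:
#   response = requests.post(ENDPOINT, headers=headers, params=params, data=body)
#   print(response.json())
```

The development cost is low precisely because the service is pre-trained; you only assemble a request and parse the JSON response.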
You must create a system capable of analyzing, in real time, the clicks that users make on a high-traffic website. Because of the large number of users the system must support, you decide to use a cluster. What kind of cluster is best suited for building real-time systems?
You are using Spark, and you have a Spark SQL DataFrame named df. From a Jupyter Notebook you run a SQL SELECT against this DataFrame:
%%sql SELECT * FROM df
After executing that code cell, you get an error saying that the table does not exist. What have you forgotten?
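A common cause of this error is querying a DataFrame that was never registered as a table visible to Spark SQL. A minimal sketch, assuming a running Spark 2.x session bound to the name `spark` (as HDInsight Jupyter notebooks provide; in Spark 1.x the equivalent call is `registerTempTable`):

```python
# Illustrative DataFrame; assumes an existing SparkSession named `spark`.
df = spark.createDataFrame([(1, "click"), (2, "scroll")], ["user", "event"])

# Register the DataFrame under a name visible to Spark SQL.
# Without this step, `SELECT * FROM df` fails with a "table not found" error.
df.createOrReplaceTempView("df")

# Now the %%sql magic (or spark.sql) can query it:
spark.sql("SELECT * FROM df").show()
```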
You need to periodically analyze the data in one of the tables of a SQL Server database. You decide to do so by reusing an R script you have already written. To schedule the script's execution easily, you decide to put it in a stored procedure. What is a required step to execute R and Python scripts from a T-SQL stored procedure?
- Define the input and output variable names.
- Encapsulate the procedure call in another custom procedure.
- Enable external script execution in the SQL Server engine.
- Execute the full R or Python script from the command line prior to executing it from T-SQL.
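For reference, enabling external scripts and invoking R from T-SQL looks like the following minimal sketch (the table name in the input query is illustrative, not from the text):

```sql
-- Enable external script execution in the SQL Server engine
-- (on some versions this also requires restarting the Launchpad service).
EXEC sp_configure 'external scripts enabled', 1;
RECONFIGURE WITH OVERRIDE;

-- Run an R script against the result of a T-SQL query; the input and
-- output variables default to InputDataSet and OutputDataSet.
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'OutputDataSet <- InputDataSet',
    @input_data_1 = N'SELECT TOP 10 * FROM dbo.Measurements';  -- illustrative table
```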