[Azure] DP-100 자격증 Build AI solutions with Azure Machine Learning - Part 1
하냐NYA
2021. 3. 23. 11:38
- Azure Machine Learning SDK
- Scalable, on-demand compute for machine learning workloads
- Data storage and connectivity to ingest data from a wide range of sources
- ML workflow orchestration to automate model training, deployment, and management processes
- Model registration and management to track multiple versions of models and the data on which they were trained
- Metrics and monitoring for training experiments, datasets, and published services
- Model deployment for real-time and batch inferencing
- Azure Machine Learning Workspaces - A context for the experiments, data, compute targets, and other assets associated with a machine learning workload.
- Creating a Workspace with Python and Azure Command Line Interface (CLI)
---Python---
from azureml.core import Workspace

ws = Workspace.create(name='aml-workspace',
                      subscription_id='123456-abc-123...',
                      resource_group='aml-resources',
                      create_resource_group=True,
                      location='eastus'
                      )
---CLI---
az ml workspace create -w 'aml-workspace' -g 'aml-resources'
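- Connecting to an existing workspace - The later snippets in these notes assume a connection to the workspace. A minimal sketch, assuming a config.json for the workspace is in the working directory (e.g. downloaded from the portal or written with ws.write_config()):
---Python---
from azureml.core import Workspace

# Assumes ./config.json holds the workspace details (hypothetical setup)
ws = Workspace.from_config()
print(ws.name, ws.resource_group, ws.location, sep='\n')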
- Logging - Log metrics and other output from each run of an experiment so that they can be retrieved easily for each run; the main options are listed below (a short logging sketch follows this list).
- Azure ML SDK: Metric - run.log(name, val)
- Python printing/logging: Log - print(val) or logging.info(message)
- OpenCensus Python: Log - logger.addHandler(AzureLogHandler()) or logging.log(level, message)
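- A minimal logging sketch with the Azure ML SDK (the experiment name and metric values below are hypothetical):
---Python---
from azureml.core import Experiment, Workspace

ws = Workspace.from_config()
experiment = Experiment(workspace=ws, name='my-experiment')

run = experiment.start_logging()                  # start an interactive run
run.log('accuracy', 0.91)                         # log a single named metric
run.log_list('loss_per_epoch', [0.8, 0.5, 0.3])   # log a list of values
run.complete()                                    # mark the run as finished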
- ScriptRunConfig - To run a script-based experiment that trains a machine learning model.
- Script Arguments - Use a library such as "argparse" to set the regularization rate hyperparameter for an algorithm used to train a model.
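- A minimal sketch of passing a regularization rate to a training script and reading it with argparse; the folder, script, and compute names are hypothetical assumptions:
---Python (submission)---
from azureml.core import Experiment, ScriptRunConfig, Workspace

ws = Workspace.from_config()

# ScriptRunConfig bundles the script, its arguments, and the compute target
script_config = ScriptRunConfig(source_directory='training_folder',
                                script='training.py',
                                arguments=['--reg-rate', 0.1],
                                compute_target='aml-cluster')

experiment = Experiment(workspace=ws, name='train-model')
run = experiment.submit(config=script_config)
run.wait_for_completion(show_output=True)
---Python (training.py)---
import argparse

# Read the hyperparameter passed in from ScriptRunConfig
parser = argparse.ArgumentParser()
parser.add_argument('--reg-rate', type=float, dest='reg_rate', default=0.01)
args = parser.parse_args()
reg = args.reg_rate  # use this value as the regularization rate when fitting the model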
- Registering a Model - To track multiple versions of a model, and retrieve models for inferencing (predicting label values from new data). Specify a name, description, tags, framework, framework version, custom properties, and other useful metadata.
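- A minimal registration sketch (the model file path, names, tags, and framework version are hypothetical placeholders):
---Python---
from azureml.core import Model, Workspace

ws = Workspace.from_config()

model = Model.register(workspace=ws,
                       model_name='classification_model',
                       model_path='model.pkl',                       # local path to the saved model file
                       description='A classification model',
                       tags={'data-format': 'CSV'},
                       model_framework=Model.Framework.SCIKITLEARN,  # framework metadata
                       model_framework_version='0.24.1')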
- Datastore - Abstractions for cloud data sources. To add a datastore to your workspace, you can register it using the graphical interface in Azure Machine Learning studio or with the SDK (a registration sketch follows this list). Commonly used sources include:
- Azure Storage (blob and file containers)
- Azure Data Lake stores
- Azure SQL Database
- Azure Databricks file system
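- A minimal SDK sketch for registering an Azure blob container as a datastore (the storage account details below are hypothetical placeholders):
---Python---
from azureml.core import Datastore, Workspace

ws = Workspace.from_config()

blob_ds = Datastore.register_azure_blob_container(workspace=ws,
                                                  datastore_name='blob_data',
                                                  container_name='data-container',
                                                  account_name='azstoreacct',
                                                  account_key='123456abcde789...')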
- Tabular Dataset - Provides the easiest way to consume structured data as a Pandas dataframe.
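- A minimal sketch of creating a tabular dataset from CSV files on a datastore and reading it as a dataframe (the path pattern is a hypothetical example):
---Python---
from azureml.core import Dataset, Workspace

ws = Workspace.from_config()
blob_ds = ws.get_default_datastore()

# One or more (datastore, path) tuples; wildcards are allowed
csv_paths = [(blob_ds, 'data/files/*.csv')]
tab_ds = Dataset.Tabular.from_delimited_files(path=csv_paths)

df = tab_ds.to_pandas_dataframe()  # consume the structured data as a Pandas dataframe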
- Environment for the script - Include all packages on which the script depends.
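- A minimal sketch of defining an environment with its package dependencies (the environment name and package list are assumptions):
---Python---
from azureml.core import Environment
from azureml.core.conda_dependencies import CondaDependencies

env = Environment('training_environment')
deps = CondaDependencies.create(conda_packages=['scikit-learn', 'pandas', 'numpy'],
                                pip_packages=['azureml-defaults'])
env.python.conda_dependencies = deps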
- Compute Target - The compute on which the environment will be deployed and the script run (a cluster provisioning sketch follows this list).
- Local Compute - A physical workstation or a virtual machine such as an Azure Machine Learning compute instance.
- Compute Clusters - For experiment workloads with high scalability requirements.
- Attached Compute - If an Azure-based compute environment for data science is already in use, it can be attached to the Azure Machine Learning workspace.
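- A minimal sketch of provisioning a compute cluster (the cluster name, VM size, and node counts are hypothetical choices):
---Python---
from azureml.core import Workspace
from azureml.core.compute import AmlCompute, ComputeTarget

ws = Workspace.from_config()

# Cluster scales between 0 and 4 nodes of the chosen VM size
compute_config = AmlCompute.provisioning_configuration(vm_size='STANDARD_DS11_V2',
                                                       min_nodes=0,
                                                       max_nodes=4)
cluster = ComputeTarget.create(ws, 'aml-cluster', compute_config)
cluster.wait_for_completion(show_output=True)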