To understand how this works in practice, we will create table storage and perform all the basic operations using Azure Storage Explorer.
How to use Storage Explorer in Azure
Table storage consists of entities, which are objects with properties. Every entity has a partition key, a row key, and a timestamp.
Partition key
The partition key identifies the partition an entity belongs to and forms the first part of the primary key of a table.
Row key
The row key is the second part of the primary key. Every entity has one partition key and one row key; together they uniquely identify the entity and form the table's clustered index, which is its only index.
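If you are scripting against Table storage rather than using the Explorer UI, an entity is just a flat bag of properties plus the PartitionKey/RowKey pair described above. A minimal sketch in Python (the helper name `make_entity` and the property values are illustrative, taken from the demo record used later in this walkthrough):

```python
def make_entity(partition_key, row_key, **properties):
    # PartitionKey + RowKey together form the entity's unique primary key;
    # the Timestamp property is maintained by the service on every write,
    # so we do not set it client-side.
    entity = {"PartitionKey": partition_key, "RowKey": row_key}
    entity.update(properties)
    return entity

# The record inserted in the walkthrough below:
demo = make_entity("Employee", "IT", Name="Information Technology")
print(demo)
```

This dictionary shape is exactly what table SDKs accept as an entity payload.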
To create a table, follow the below steps.
1. Select the table node, right-click, and click the Create Table link. Alternatively, click the Create Table link from the Actions section. See the screenshot below.

2. Now, provide a name for the table storage.

The new Demo1 table is empty: it has no partition key, row key, or property data yet.
3. To add the partition key and row key, click on the Add button on the menu and provide the partition key as Employee and the row key as IT.
Add a new property called Name by clicking the Add property button and provide the value as Information Technology.
Then click on the Insert button. Check out the screenshot below.

Now you can see that one record has been added to the table.

Add a few more records to the table.

Now, you can add a filter condition to filter the records.
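The filter box in Storage Explorer takes an OData `$filter` expression. If you want to build the same filter programmatically, a small helper like this works (the function name is illustrative; `Employee` and `IT` are the demo values from above):

```python
def table_filter(partition_key, row_key=None):
    # The Table service uses OData comparison syntax: eq, ne, gt, lt,
    # combined with 'and' / 'or'. String values are single-quoted.
    expr = f"PartitionKey eq '{partition_key}'"
    if row_key is not None:
        expr += f" and RowKey eq '{row_key}'"
    return expr

print(table_filter("Employee", "IT"))
# PartitionKey eq 'Employee' and RowKey eq 'IT'
```

Pasting that exact expression into the Explorer filter bar returns only the matching entity.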

Add a resource via Azure AD
Click on the Open Connect Dialog button on the left side.

Select the option Add a resource via Azure Active Directory (Azure AD) and click the Next button.

Select an Azure account and tenant. The account must have access to the storage resource you want to attach to. Then select Next.

Choose the resource type, provide the container URL and a connection display name, and click the Next button.
In the next step, review the Connection Summary to make sure all the information is correct. If anything is wrong, click the Back button to edit it; if everything is correct, click the Connect button.
Once the connection is successfully added, you can see it on Local & Attached –> Storage Accounts –> (Attached Containers) –> Blob Containers.
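The container URL entered above follows a predictable shape: the storage account name is the first label of the hostname and the container name is the first path segment. A quick sketch of pulling both out (the account and container names here are placeholders, not real resources):

```python
from urllib.parse import urlparse

def split_container_url(url):
    # A blob container URL looks like:
    #   https://<account>.blob.core.windows.net/<container>
    parsed = urlparse(url)
    account = parsed.netloc.split(".")[0]          # first hostname label
    container = parsed.path.lstrip("/").split("/")[0]  # first path segment
    return account, container

print(split_container_url("https://mystorage.blob.core.windows.net/mycontainer"))
# ('mystorage', 'mycontainer')
```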
Use a connection string
Click on the Open Connect Dialog button on the left side.

Select the Use a connection string option and then click on the Next button.

Provide a display name for the connection and enter your connection string. Select the Next button.
Now review the Connection Summary and verify that all the information provided is correct. If not, click the Back button and make your changes; if everything is correct, click the Connect button.
Once the connection is established successfully, you can see the resource under Local & Attached –> Storage Accounts.
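A storage connection string is just a semicolon-separated list of `Key=Value` pairs, which makes it easy to inspect or validate in a script. A sketch (the account name and `<base64-key>` placeholder are illustrative):

```python
def parse_connection_string(conn_str):
    # Connection strings look like:
    #   DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net
    # partition("=") splits only on the FIRST '=', so base64 keys that end
    # in '=' padding are preserved intact.
    parts = {}
    for segment in filter(None, conn_str.split(";")):
        key, _, value = segment.partition("=")
        parts[key] = value
    return parts

sample = ("DefaultEndpointsProtocol=https;AccountName=mystorage;"
          "AccountKey=<base64-key>;EndpointSuffix=core.windows.net")
print(parse_connection_string(sample)["AccountName"])
# mystorage
```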
Use a shared access signature (SAS) URI
Click on the Open Connect Dialog button on the left side.

Select the Use a shared access signature (SAS) URI option and click the Next button.

Provide a display name and enter your shared access signature URI. The service endpoint for the resource type should be autofilled. Click on the Next button.
Now, on the Connection Summary page, verify if all the information provided is correct. If not, you can click the Back button and make the changes. If all the information is correct, click on the Connect button.
The resource is under Local & Attached –> Storage Accounts –> (Attached Containers) –> Service node.
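A SAS URI is simply the service endpoint followed by the SAS token as query parameters (`sv`, `sp`, `se`, `sig`, and so on). Splitting the two apart, for example to check the expiry (`se`) before attaching, can be sketched as follows (the URI and its parameter values are placeholders):

```python
from urllib.parse import urlparse, parse_qs

def split_sas_uri(sas_uri):
    # endpoint = scheme + host + path; token = the query-string parameters
    parsed = urlparse(sas_uri)
    endpoint = f"{parsed.scheme}://{parsed.netloc}{parsed.path}"
    token = parse_qs(parsed.query)
    return endpoint, token

endpoint, token = split_sas_uri(
    "https://mystorage.blob.core.windows.net/mycontainer"
    "?sv=2022-11-02&sp=rl&se=2030-01-01&sig=abc123")
print(endpoint, sorted(token))
```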
Use a storage account name and key
Click on the Open Connect Dialog button on the left side.

Select Use a storage account name and key, then click the Next button.

Provide the display name, the account name, and the access key, choose a storage domain, and click the Next button.
Now, on the Connection Summary page, verify if all the information provided is correct. If not, you can click on the Back button and make the changes. If all the information is correct, click on the Connect button.
You can see the resource under Local & Attached –> Storage Accounts.
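The name-and-key form is effectively the connection-string form from earlier, assembled for you. If you ever need to go the other direction and build the string yourself, a sketch (function name and `<base64-key>` are illustrative):

```python
def build_connection_string(account_name, account_key,
                            suffix="core.windows.net"):
    # Assembles the standard HTTPS connection string from a storage
    # account name and access key.
    return (f"DefaultEndpointsProtocol=https;AccountName={account_name};"
            f"AccountKey={account_key};EndpointSuffix={suffix}")

print(build_connection_string("mystorage", "<base64-key>"))
```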
Attach to a local emulator
Azure Storage Explorer supports two official Storage emulators:
- Azure Storage Emulator: This is available only for Windows operating systems.
- Azurite: This is available for Windows, macOS, and Linux operating systems.
If your emulator is not running on the default ports, follow the steps below.
Start your emulator first; you can check whether it is running with the command AzureStorageEmulator.exe status.
Click on the Open Connect Dialog button on the left side.

Select the Attach to a local emulator option and click the Next button.

Provide a display name and enter the ports your emulator is listening on. Attach to a Local Emulator suggests the default port values for most emulators, but you can enter different ports if yours differ. Then, select Next.
Now, on the Connection Summary page, verify if all the information provided is correct. If not, you can click the Back button and make the changes. If all the information is correct, click on the Connect button.
You can see the resource under Local & Attached –> Storage Accounts.
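For reference, Azurite's documented defaults are ports 10000 (Blob), 10001 (Queue), and 10002 (Table), with the well-known local account name `devstoreaccount1`; these are the values the dialog pre-fills. A sketch of how the local endpoint URLs are shaped (note that, unlike real accounts, the emulator puts the account name in the path rather than the hostname):

```python
# Azurite / Storage Emulator defaults.
AZURITE_PORTS = {"blob": 10000, "queue": 10001, "table": 10002}
DEV_ACCOUNT = "devstoreaccount1"  # well-known local development account

def emulator_endpoint(service, host="127.0.0.1", ports=AZURITE_PORTS):
    # Local-emulator URLs place the account name in the URL path:
    #   http://127.0.0.1:<port>/devstoreaccount1
    return f"http://{host}:{ports[service]}/{DEV_ACCOUNT}"

print(emulator_endpoint("table"))
# http://127.0.0.1:10002/devstoreaccount1
```

SDKs also accept the shorthand connection string `UseDevelopmentStorage=true`, which expands to these same defaults.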
Connect to an Azure Cosmos DB using a connection string
We can use a connection string to connect to an Azure Cosmos DB account.
- Under Local & Attached, right-click Cosmos DB Accounts (Preview), and select Connect to Cosmos DB.
- Alternatively, under Actions, click on Connect to Cosmos DB.

- Select API, enter your Connection String data, and then select the Next button to connect to the Azure Cosmos DB.
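Cosmos DB connection strings use the same `Key=Value;` layout as storage connection strings, with `AccountEndpoint` and `AccountKey` pairs. A sketch of pulling the pieces out before pasting the string into the dialog (the account name and `<key>` placeholder are illustrative):

```python
def parse_cosmos_connection_string(conn_str):
    # Cosmos DB (SQL API) connection strings look like:
    #   AccountEndpoint=https://<account>.documents.azure.com:443/;AccountKey=<key>;
    # partition("=") splits only on the first '=', so the endpoint URL and
    # any '=' padding inside the key survive intact.
    parts = {}
    for segment in filter(None, conn_str.split(";")):
        key, _, value = segment.partition("=")
        parts[key] = value
    return parts

sample = "AccountEndpoint=https://mycosmos.documents.azure.com:443/;AccountKey=<key>;"
print(parse_cosmos_connection_string(sample)["AccountEndpoint"])
# https://mycosmos.documents.azure.com:443/
```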
Connect to Azure Data Lake Store by URI
- Under Local & Attached, right-click Data Lake Storage Gen1 (Preview), and select Connect to Data Lake Storage Gen1.
- Alternatively, under Actions, click Connect to Data Lake Storage Gen1.

- On the Connect to Data Lake Store window, provide the ADL URI, and then click on the OK button to connect to the Data Lake Storage.
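Data Lake Storage Gen1 URIs use the `adl://` scheme with the account name as the first hostname label, e.g. `adl://<account>.azuredatalakestore.net`. A sketch of validating one and extracting the account name before entering it in the dialog (the account name here is a placeholder):

```python
from urllib.parse import urlparse

def adl_account(adl_uri):
    # Gen1 ADL URIs look like: adl://<account>.azuredatalakestore.net
    parsed = urlparse(adl_uri)
    host = parsed.netloc
    if not host.endswith(".azuredatalakestore.net"):
        raise ValueError(f"not a Data Lake Storage Gen1 URI: {adl_uri}")
    return host.split(".")[0]  # the account name label

print(adl_account("adl://mydatalake.azuredatalakestore.net"))
# mydatalake
```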
I am Rajkishore, a Microsoft Certified IT Consultant. I have over 14 years of experience with Microsoft Azure and AWS, including Azure Functions, Storage, Virtual Machines, Logic Apps, PowerShell commands, CLI commands, Machine Learning, AI, Azure Cognitive Services, and DevOps. I also have hands-on experience designing and developing cloud-native data integrations on Azure and AWS. I hope you will learn from these practical Azure tutorials.
