Compare commits

..

47 Commits

Author SHA1 Message Date
Sourabh Jain
a7e38201c4 refactored code 2025-04-07 21:26:02 +05:30
Sourabh Jain
80edb66fbf wip for private endpoint support 2025-04-04 22:41:45 +05:30
Sourabh Jain
c62e89228e updated code 2025-04-03 11:27:52 +05:30
Sourabh Jain
d68bf02a0e terminal log changes 2025-04-02 07:15:49 +05:30
Sourabh Jain
62e90ed26d vnet automation 2025-04-02 06:51:09 +05:30
Sourabh Jain
1f4e79f856 fix comands 2025-03-25 13:00:28 +05:30
Sourabh Jain
9db1edca03 fix vcore username 2025-03-25 08:04:22 +05:30
Sourabh Jain
8b4eaa95ea ui and functional changes 2025-03-25 00:24:17 +05:30
Sourabh Jain
10b0da2190 Vnet addition 2025-03-22 15:33:56 +05:30
Sourabh Jain
4313d6ecbd enable cloudshell with vnet config enabled 2025-03-18 22:47:31 +05:30
Sourabh Jain
83eafd4485 fix terminals 2025-03-18 17:13:49 +05:30
Sourabh Jain
44e85647e4 add other db support 2025-03-17 06:35:19 +05:30
Sourabh Jain
ec891671b6 code refactor 2025-02-28 07:55:48 +05:30
Sourabh Jain
942de980c3 code refactor 2025-02-25 12:09:31 +05:30
Sourabh Jain
2c3c4e7db7 added consent 2025-02-25 11:23:42 +05:30
Sourabh Jain
9b2cb8a1a9 refactor code and add casandra and mongo commands 2025-02-24 23:24:25 +05:30
Sourabh Jain
41439cc7d4 refactor code 2025-02-24 15:10:07 +05:30
Sourabh Jain
ce08ce05f2 mongo is working 2025-02-21 21:09:44 +05:30
Sourabh Jain
323276beff cloudshell api failed with client id 2025-02-19 07:00:41 +05:30
Sourabh Jain
1678ec0a23 xterm add 2025-02-18 17:11:39 +05:30
Sourabh Jain
0babb1fa13 reverted code 2025-02-18 07:57:30 +05:30
Sourabh Jain
78c8df0904 Not wroking code 2025-02-18 07:53:52 +05:30
Sourabh Jain
76742455bf first draft 2025-02-17 07:12:49 +05:30
bogercraig
2730da7ab6 Backend Migration - Remove Use of Legacy Backend from DE (#2043)
* Default to new backend endpoint if the endpoint in current context does not match existing set in constants.

* Remove some env references.

* Added comments with reasoning for selecting new backend by default.

* Update comment.

* Remove all references to useNewPortalBackendEndpoint now that old backend is disabled in all environments.

* Resolve lint issues.

* Removed references to old backend from Cassandra and Mongo Apis

* fix unit tests

---------

Co-authored-by: Asier Isayas <aisayas@microsoft.com>
2025-02-12 18:12:59 -08:00
sunghyunkang1111
de2449ee25 Adding throughput bucket settings in Data Explorer (#2044)
* Added throughput bucketing

* fix bugs

* enable/disable per autoscale selection

* Added logic

* change query bucket to group

* Updated to a tab

* Fixed unit tests

* Edit package-lock

* Compile build fix

* fix unit tests

* moving the throughput bucket flag to the client generation level
2025-02-12 13:10:07 -06:00
sunghyunkang1111
99378582ce Remove blocking await on sample database (#2047)
* Remove blocking await on sample database

* Remove compress flag to reduce bundle size

* Fix typo in webpack config comment date
2025-02-12 13:09:52 -06:00
SATYA SB
bd592d07af [accessibility-1217621]: Keyboard focus gets lost on the page which opens after activating "Data Explorer" menu item present under 'Overview' page. (#1927)
Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-02-12 11:31:30 +05:30
asier-isayas
644f5941ec Set default throughput based on account's workload type (#2021)
* assign default throughput based on workload type

* combined common logic

* fix unit tests

* add tests

* update tests

* npm run format

* Update ci.yml

---------

Co-authored-by: Asier Isayas <aisayas@microsoft.com>
2025-02-11 17:47:55 -05:00
jawelton74
9fb006a996 Restore DisplayNPSSurvey message type enum which was removed in a prior (#2046)
change.
2025-02-11 06:58:44 -08:00
jawelton74
c2b98c3e23 Modify E2E cleanup script to use @azure/identity for AZ credentials. (#2051) 2025-02-10 08:48:26 -08:00
Nishtha Ahuja
76d49d86d4 Added emulator checks in settings pane fields (#2041)
* added emulator checks

* created macro

* conditions as const

---------

Co-authored-by: Nishtha Ahuja <nishthaahuja@microsoft.com>
2025-02-10 11:52:56 +05:30
Laurent Nguyen
7893b89bf7 Do not open first container if a tab is already open (#2045)
Co-authored-by: Laurent Nguyen <languye@microsoft.com>
2025-02-06 21:58:38 +01:00
JustinKol
5945e3cb6b Removed NPS Survey from DE since it has been moved to the Overview Blade (#2027)
* Removed NPS Survey from DE since it has been moved to the Overview Blade

* Added ExplorerBindings back

* Moved applyExplorerBindings back to original place
2025-02-05 13:30:03 -05:00
Laurent Nguyen
213d1c68fe Remove feature switch on restore tabs (#2039) 2025-02-03 17:59:00 +01:00
Nishtha Ahuja
c26f9a1ebb disabled change buttom for emulator (#2017)
Co-authored-by: Nishtha Ahuja <nishthaahuja@microsoft.com>
2025-02-03 12:39:01 +05:30
SATYA SB
bd7cd7ae8f [accessibility-3556793]: [Screen Reader- Azure Cosmos DB- Data Explorer]: The Learn more links are not descriptive present under the settings. (#2035)
Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-01-31 10:58:44 +05:30
SATYA SB
6504358580 [Programmatic Access - Azure Cosmos DB- Data Explorer]: Keyboard focus indicator is not visible on controls inside the settings. (#2016)
* [accessibility-3556824] : [Programmatic Access - Azure Cosmos DB- Data Explorer]: Keyboard focus indicator is not visible on controls inside the settings.

* Snapshots updated.

---------

Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-01-31 10:53:18 +05:30
SATYA SB
ce88659fca [Keyboard Navigation - Cosmos DB Query Copilot - Query Faster with Copilot>Enable Query Advisor]: Keyboard focus order is not logical after selecting the 'Copy code' button. (#2010)
* [accessibility-3560073]: [Keyboard Navigation - Cosmos DB Query Copilot - Query Faster with Copilot>Enable Query Advisor]: Keyboard focus order is not logical after selecting the 'Copy code' button.

* [Keyboard Navigation - Cosmos DB Query Copilot - Query Faster with Copilot>Enable Query Advisor]: Keyboard focus order is not logical after selecting the 'Copy code' button.

---------

Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-01-31 10:48:34 +05:30
SATYA SB
642c708e9c [accessibility-3556756]: [Programmatic Access- Azure Cosmos DB- Data explorer]: Ensures <img> elements have alternate text or a role of none or presentation. (#2007)
Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-01-31 10:45:49 +05:30
SATYA SB
4156009d09 [Screen reader - Cosmos DB Query Copilot - Query Faster with Copilot>Enable Query Advisor]: Screen reader does not announce status information which appears on invoking the 'Send' button. (#2002)
* [accessibility-3549715]: [Screen reader - Cosmos DB Query Copilot - Query Faster with Copilot>Enable Query Advisor]: Screen reader does not announce status information which appears on invoking the 'Send' button.

* [accessibility-3549715]:[Screen reader - Cosmos DB Query Copilot - Query Faster with Copilot>Enable Query Advisor]: Screen reader does not announce status information which appears on invoking the 'Send' button.

---------

Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-01-31 10:44:32 +05:30
SATYA SB
5c6abbd635 [accessibility-3556595]: [Programmatic Access- Azure Cosmos DB- Data Explorer]: Ensures role attribute has an appropriate value for the element. (#2001)
Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-01-31 10:37:11 +05:30
jawelton74
881726e9af New preview site (#2036)
* Changes to DE preview site to support managed identity. Changes to
infrastructure to use new preview site.

* Fix formatting.

* Potential fix for code scanning alert no. 56: Server-side request forgery

Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>

* Use different secrets for subscription/tenant/client id's.

* Revert new id names.

* Update Az CLI config.

* Update to Node 18 and update security vulnerable dependencies.

---------

Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
2025-01-30 16:14:03 -08:00
jawelton74
7015590d1a Remove hard coded client and subscription Ids from webpack config. (#2033) 2025-01-24 07:23:33 -08:00
jawelton74
1d952a4ea2 Remove throughput survey text and link from Throughput tab. (#2031) 2025-01-21 10:53:47 -08:00
jawelton74
2a81551a60 Use unique names in upload artifacts tasks (#2030)
* Specify actual package names in upload artifacts task.

* Revert path change, use unique names for upload task.

* Fix the right properties.

* Revert condition change
2025-01-21 07:07:53 -08:00
jawelton74
eceee36913 Use azure identity package for e2e test credentials (#2032)
* Update identity package, remove ms-rest-nodeauth package.

* Test changes to use identity package.
2025-01-21 07:07:18 -08:00
jawelton74
96faf92c12 Use dotnet CLI for nuget operations in CI pipeline (#2026)
* Start of moving nuget actions to use dotnet.

* Comment out env section

* Set auth token.

* Disable globalization support.

* Comment out dotnet setup.

* Copy proj file with build.

* PLace Content item under ItemGroup.

* Update project with Sdk and No Build args.

* Remove no-build from cmd line.

* Set TargetFramework version.

* Fix TargetFramework value.

* Add nuget push command.

* Fix test version string

* Add nuget add source step.

* Fix add source args.

* Enable cleartext password, remove source after completion.

* Use wildcard for nupkg path. Add debug.

* Remove debug.

* Fix nupkg path

* Fix API key argument

* Re-enable MPAC nuget. Tidy up ci.yml.

* Fix formatting of webpack config.

* Remove Globalization flag.

* Revert test changes.
2025-01-15 11:37:30 -08:00
120 changed files with 5782 additions and 2321 deletions


@@ -1 +1 @@
-[Preview this branch](https://cosmos-explorer-preview.azurewebsites.net/pull/EDIT_THIS_NUMBER_IN_THE_PR_DESCRIPTION?feature.someFeatureFlagYouMightNeed=true)
+[Preview this branch](https://dataexplorer-preview.azurewebsites.net/pull/EDIT_THIS_NUMBER_IN_THE_PR_DESCRIPTION?feature.someFeatureFlagYouMightNeed=true)


@@ -83,7 +83,7 @@ jobs:
       - run: npm ci
       - run: npm run build:contracts
       - name: Restore Build Cache
-        uses: actions/cache@v2
+        uses: actions/cache@v4
         with:
           path: .cache
           key: ${{ runner.os }}-build-cache
@@ -96,14 +96,16 @@ jobs:
         with:
          name: dist
          path: dist/
+      - name: "Az CLI login"
+        uses: azure/login@v1
+        with:
+          client-id: ${{ secrets.AZURE_CLIENT_ID }}
+          tenant-id: ${{ secrets.AZURE_TENANT_ID }}
+          subscription-id: ${{ secrets.PREVIEW_SUBSCRIPTION_ID }}
       - name: Upload build to preview blob storage
-        run: az storage blob upload-batch -d '$web' -s 'dist' --account-name cosmosexplorerpreview --destination-path "${{github.event.pull_request.head.sha || github.sha}}" --account-key="${PREVIEW_STORAGE_KEY}" --overwrite true
-        env:
-          PREVIEW_STORAGE_KEY: ${{ secrets.PREVIEW_STORAGE_KEY }}
+        run: az storage blob upload-batch -d '$web' -s 'dist' --account-name ${{ secrets.PREVIEW_STORAGE_ACCOUNT_NAME }} --destination-path "${{github.event.pull_request.head.sha || github.sha}}" --auth-mode login --overwrite true
       - name: Upload preview config to blob storage
-        run: az storage blob upload -c '$web' -f ./preview/config.json --account-name cosmosexplorerpreview --name "${{github.event.pull_request.head.sha || github.sha}}/config.json" --account-key="${PREVIEW_STORAGE_KEY}" --overwrite true
-        env:
-          PREVIEW_STORAGE_KEY: ${{ secrets.PREVIEW_STORAGE_KEY }}
+        run: az storage blob upload -c '$web' -f ./preview/config.json --account-name ${{ secrets.PREVIEW_STORAGE_ACCOUNT_NAME }} --name "${{github.event.pull_request.head.sha || github.sha}}/config.json" --auth-mode login --overwrite true
   nuget:
     name: Publish Nuget
     if: github.ref == 'refs/heads/master' || contains(github.ref, 'hotfix/') || contains(github.ref, 'release/')
@@ -113,21 +115,21 @@
       NUGET_SOURCE: ${{ secrets.NUGET_SOURCE }}
       AZURE_DEVOPS_PAT: ${{ secrets.AZURE_DEVOPS_PAT }}
     steps:
-      - uses: nuget/setup-nuget@v2
-        with:
-          nuget-api-key: ${{ secrets.NUGET_API_KEY }}
       - name: Download Dist Folder
         uses: actions/download-artifact@v4
         with:
          name: dist
       - run: cp ./configs/prod.json config.json
-      - run: nuget sources add -Name "ADO" -Source "$NUGET_SOURCE" -UserName "jawelton@microsoft.com" -Password "$AZURE_DEVOPS_PAT"
-      - run: nuget pack -Version "2.0.0-github-${GITHUB_SHA}"
-      - run: nuget push -SkipDuplicate -Source "$NUGET_SOURCE" -ApiKey Az *.nupkg
+      - run: dotnet nuget add source "$NUGET_SOURCE" --name "ADO" --username "jawelton@microsoft.com" --password "$AZURE_DEVOPS_PAT" --store-password-in-clear-text
+      - run: dotnet pack DataExplorer.proj /p:PackageVersion="2.0.0-github-${GITHUB_SHA}"
+      - run: dotnet nuget push "bin/Release/*.nupkg" --skip-duplicate --api-key Az --source="$NUGET_SOURCE"
+      - run: dotnet nuget remove source "ADO"
       - uses: actions/upload-artifact@v4
-        name: packages
+        name: Upload package to Artifacts
        with:
-          path: "*.nupkg"
+          name: prod-package
+          path: "bin/Release/*.nupkg"
   nugetmpac:
     name: Publish Nuget MPAC
     if: github.ref == 'refs/heads/master' || contains(github.ref, 'hotfix/') || contains(github.ref, 'release/')
@@ -137,22 +139,21 @@
       NUGET_SOURCE: ${{ secrets.NUGET_SOURCE }}
       AZURE_DEVOPS_PAT: ${{ secrets.AZURE_DEVOPS_PAT }}
     steps:
-      - uses: nuget/setup-nuget@v2
-        with:
-          nuget-api-key: ${{ secrets.NUGET_API_KEY }}
       - name: Download Dist Folder
         uses: actions/download-artifact@v4
         with:
          name: dist
       - run: cp ./configs/mpac.json config.json
       - run: sed -i 's/Azure.Cosmos.DB.Data.Explorer/Azure.Cosmos.DB.Data.Explorer.MPAC/g' DataExplorer.nuspec
-      - run: nuget sources add -Name "ADO" -Source "$NUGET_SOURCE" -UserName "jawelton@microsoft.com" -Password "$AZURE_DEVOPS_PAT"
-      - run: nuget pack -Version "2.0.0-github-${GITHUB_SHA}"
-      - run: nuget push -SkipDuplicate -Source "$NUGET_SOURCE" -ApiKey Az *.nupkg
+      - run: dotnet nuget add source "$NUGET_SOURCE" --name "ADO" --username "jawelton@microsoft.com" --password "$AZURE_DEVOPS_PAT" --store-password-in-clear-text
+      - run: dotnet pack DataExplorer.proj /p:PackageVersion="2.0.0-github-${GITHUB_SHA}"
+      - run: dotnet nuget push "bin/Release/*.nupkg" --skip-duplicate --api-key Az --source="$NUGET_SOURCE"
+      - run: dotnet nuget remove source "ADO"
       - uses: actions/upload-artifact@v4
-        name: packages
+        name: Upload package to Artifacts
        with:
-          path: "*.nupkg"
+          name: mpac-package
+          path: "bin/Release/*.nupkg"
   playwright-tests:
     name: "Run Playwright Tests (Shard ${{ matrix.shardIndex }} of ${{ matrix.shardTotal }})"

DataExplorer.proj (new file, 9 lines)

@@ -0,0 +1,9 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net8.0</TargetFramework>
<NoBuild>true</NoBuild>
<IncludeBuildOutput>false</IncludeBuildOutput>
<NuspecFile>DataExplorer.nuspec</NuspecFile>
<NuspecProperties>version=$(PackageVersion)</NuspecProperties>
</PropertyGroup>
</Project>


@@ -1830,6 +1830,14 @@ input::-webkit-calendar-picker-indicator::after {
   transform: rotate(90deg);
 }
+
+.customAccordion button:focus {
+  .focus();
+}
+
+.customAccordion {
+  margin-top: 1px;
+}
 .datalist-arrow:after:hover {
   content: "\276F";
   position: absolute;

package-lock.json (generated, 522 changed lines)

@@ -12,8 +12,7 @@
"@azure/arm-cosmosdb": "9.1.0", "@azure/arm-cosmosdb": "9.1.0",
"@azure/cosmos": "4.2.0-beta.1", "@azure/cosmos": "4.2.0-beta.1",
"@azure/cosmos-language-service": "0.0.5", "@azure/cosmos-language-service": "0.0.5",
"@azure/identity": "1.5.2", "@azure/identity": "4.5.0",
"@azure/ms-rest-nodeauth": "3.1.1",
"@azure/msal-browser": "2.14.2", "@azure/msal-browser": "2.14.2",
"@babel/plugin-proposal-class-properties": "7.12.1", "@babel/plugin-proposal-class-properties": "7.12.1",
"@babel/plugin-proposal-decorators": "7.12.12", "@babel/plugin-proposal-decorators": "7.12.12",
@@ -52,6 +51,7 @@
"@types/mkdirp": "1.0.1", "@types/mkdirp": "1.0.1",
"@types/node-fetch": "2.5.7", "@types/node-fetch": "2.5.7",
"@xmldom/xmldom": "0.7.13", "@xmldom/xmldom": "0.7.13",
"@xterm/xterm": "5.5.0",
"allotment": "1.20.2", "allotment": "1.20.2",
"applicationinsights": "1.8.0", "applicationinsights": "1.8.0",
"bootstrap": "3.4.1", "bootstrap": "3.4.1",
@@ -428,47 +428,68 @@
"license": "0BSD" "license": "0BSD"
}, },
"node_modules/@azure/identity": { "node_modules/@azure/identity": {
"version": "1.5.2", "version": "4.5.0",
"license": "MIT", "resolved": "https://registry.npmjs.org/@azure/identity/-/identity-4.5.0.tgz",
"integrity": "sha512-EknvVmtBuSIic47xkOqyNabAme0RYTw52BTMz8eBgU1ysTyMrD1uOoM+JdS0J/4Yfp98IBT3osqq3BfwSaNaGQ==",
"dependencies": { "dependencies": {
"@azure/core-auth": "^1.3.0", "@azure/abort-controller": "^2.0.0",
"@azure/core-client": "^1.0.0", "@azure/core-auth": "^1.9.0",
"@azure/core-rest-pipeline": "^1.1.0", "@azure/core-client": "^1.9.2",
"@azure/core-tracing": "1.0.0-preview.12", "@azure/core-rest-pipeline": "^1.17.0",
"@azure/core-tracing": "^1.0.0",
"@azure/core-util": "^1.11.0",
"@azure/logger": "^1.0.0", "@azure/logger": "^1.0.0",
"@azure/msal-node": "1.0.0-beta.6", "@azure/msal-browser": "^3.26.1",
"@types/stoppable": "^1.1.0", "@azure/msal-node": "^2.15.0",
"axios": "^0.21.1",
"events": "^3.0.0", "events": "^3.0.0",
"jws": "^4.0.0", "jws": "^4.0.0",
"msal": "^1.0.2", "open": "^8.0.0",
"open": "^7.0.0",
"qs": "^6.7.0",
"stoppable": "^1.1.0", "stoppable": "^1.1.0",
"tslib": "^2.0.0",
"uuid": "^8.3.0"
},
"engines": {
"node": ">=12.0.0"
},
"optionalDependencies": {
"keytar": "^7.3.0"
}
},
"node_modules/@azure/identity/node_modules/@azure/core-tracing": {
"version": "1.0.0-preview.12",
"license": "MIT",
"dependencies": {
"@opentelemetry/api": "^1.0.0",
"tslib": "^2.2.0" "tslib": "^2.2.0"
}, },
"engines": { "engines": {
"node": ">=12.0.0" "node": ">=18.0.0"
}
},
"node_modules/@azure/identity/node_modules/@azure/msal-browser": {
"version": "3.28.1",
"resolved": "https://registry.npmjs.org/@azure/msal-browser/-/msal-browser-3.28.1.tgz",
"integrity": "sha512-OHHEWMB5+Zrix8yKvLVzU3rKDFvh7SOzAzXfICD7YgUXLxfHpTPX2pzOotrri1kskwhHqIj4a5LvhZlIqE7C7g==",
"dependencies": {
"@azure/msal-common": "14.16.0"
},
"engines": {
"node": ">=0.8.0"
}
},
"node_modules/@azure/identity/node_modules/@azure/msal-common": {
"version": "14.16.0",
"resolved": "https://registry.npmjs.org/@azure/msal-common/-/msal-common-14.16.0.tgz",
"integrity": "sha512-1KOZj9IpcDSwpNiQNjt0jDYZpQvNZay7QAEi/5DLubay40iGYtLzya/jbjRPLyOTZhEKyL1MzPuw2HqBCjceYA==",
"engines": {
"node": ">=0.8.0"
}
},
"node_modules/@azure/identity/node_modules/open": {
"version": "8.4.2",
"resolved": "https://registry.npmjs.org/open/-/open-8.4.2.tgz",
"integrity": "sha512-7x81NCL719oNbsq/3mh+hVrAWmFuEYUqrq/Iw3kUzH8ReypT9QQ0BLoJS7/G9k6N81XjW4qHWtjWwe/9eLy1EQ==",
"dependencies": {
"define-lazy-prop": "^2.0.0",
"is-docker": "^2.1.1",
"is-wsl": "^2.2.0"
},
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
} }
}, },
"node_modules/@azure/identity/node_modules/tslib": { "node_modules/@azure/identity/node_modules/tslib": {
"version": "2.6.2", "version": "2.8.1",
"license": "0BSD" "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.8.1.tgz",
"integrity": "sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w=="
}, },
"node_modules/@azure/logger": { "node_modules/@azure/logger": {
"version": "1.0.4", "version": "1.0.4",
@@ -484,10 +505,6 @@
"version": "2.6.2", "version": "2.6.2",
"license": "0BSD" "license": "0BSD"
}, },
"node_modules/@azure/ms-rest-azure-env": {
"version": "2.0.0",
"license": "MIT"
},
"node_modules/@azure/ms-rest-azure-js": { "node_modules/@azure/ms-rest-azure-js": {
"version": "2.1.0", "version": "2.1.0",
"license": "MIT", "license": "MIT",
@@ -559,15 +576,6 @@
"node": ">=4.0" "node": ">=4.0"
} }
}, },
"node_modules/@azure/ms-rest-nodeauth": {
"version": "3.1.1",
"license": "MIT",
"dependencies": {
"@azure/ms-rest-azure-env": "^2.0.0",
"@azure/ms-rest-js": "^2.0.4",
"adal-node": "^0.2.2"
}
},
"node_modules/@azure/msal-browser": { "node_modules/@azure/msal-browser": {
"version": "2.14.2", "version": "2.14.2",
"license": "MIT", "license": "MIT",
@@ -589,13 +597,24 @@
} }
}, },
"node_modules/@azure/msal-node": { "node_modules/@azure/msal-node": {
"version": "1.0.0-beta.6", "version": "2.16.2",
"license": "MIT", "resolved": "https://registry.npmjs.org/@azure/msal-node/-/msal-node-2.16.2.tgz",
"integrity": "sha512-An7l1hEr0w1HMMh1LU+rtDtqL7/jw74ORlc9Wnh06v7TU/xpG39/Zdr1ZJu3QpjUfKJ+E0/OXMW8DRSWTlh7qQ==",
"dependencies": { "dependencies": {
"@azure/msal-common": "^4.0.0", "@azure/msal-common": "14.16.0",
"axios": "^0.21.1", "jsonwebtoken": "^9.0.0",
"jsonwebtoken": "^8.5.1",
"uuid": "^8.3.0" "uuid": "^8.3.0"
},
"engines": {
"node": ">=16"
}
},
"node_modules/@azure/msal-node/node_modules/@azure/msal-common": {
"version": "14.16.0",
"resolved": "https://registry.npmjs.org/@azure/msal-common/-/msal-common-14.16.0.tgz",
"integrity": "sha512-1KOZj9IpcDSwpNiQNjt0jDYZpQvNZay7QAEi/5DLubay40iGYtLzya/jbjRPLyOTZhEKyL1MzPuw2HqBCjceYA==",
"engines": {
"node": ">=0.8.0"
} }
}, },
"node_modules/@babel/code-frame": { "node_modules/@babel/code-frame": {
@@ -10073,13 +10092,6 @@
"@octokit/openapi-types": "^19.0.2" "@octokit/openapi-types": "^19.0.2"
} }
}, },
"node_modules/@opentelemetry/api": {
"version": "1.8.0",
"license": "Apache-2.0",
"engines": {
"node": ">=8.0.0"
}
},
"node_modules/@phosphor/algorithm": { "node_modules/@phosphor/algorithm": {
"version": "1.2.0", "version": "1.2.0",
"license": "BSD-3-Clause" "license": "BSD-3-Clause"
@@ -12725,13 +12737,6 @@
"devOptional": true, "devOptional": true,
"license": "MIT" "license": "MIT"
}, },
"node_modules/@types/stoppable": {
"version": "1.1.3",
"license": "MIT",
"dependencies": {
"@types/node": "*"
}
},
"node_modules/@types/styled-components": { "node_modules/@types/styled-components": {
"version": "5.1.1", "version": "5.1.1",
"dev": true, "dev": true,
@@ -13234,6 +13239,11 @@
"node": ">=10.0.0" "node": ">=10.0.0"
} }
}, },
"node_modules/@xterm/xterm": {
"version": "5.5.0",
"resolved": "https://registry.npmjs.org/@xterm/xterm/-/xterm-5.5.0.tgz",
"integrity": "sha512-hqJHYaQb5OptNunnyAnkHyM8aCjZ1MEIDTQu1iIbbTD/xops91NB5yq1ZK/dC2JDbVWtF23zUtl9JE2NqwT87A=="
},
"node_modules/@xtuc/ieee754": { "node_modules/@xtuc/ieee754": {
"version": "1.2.0", "version": "1.2.0",
"license": "BSD-3-Clause" "license": "BSD-3-Clause"
@@ -13334,61 +13344,6 @@
"node": ">=0.4.0" "node": ">=0.4.0"
} }
}, },
"node_modules/adal-node": {
"version": "0.2.4",
"license": "Apache-2.0",
"dependencies": {
"@xmldom/xmldom": "^0.8.3",
"async": "^2.6.3",
"axios": "^0.21.1",
"date-utils": "*",
"jws": "3.x.x",
"underscore": ">= 1.3.1",
"uuid": "^3.1.0",
"xpath.js": "~1.1.0"
},
"engines": {
"node": ">= 0.6.15"
}
},
"node_modules/adal-node/node_modules/@xmldom/xmldom": {
"version": "0.8.10",
"license": "MIT",
"engines": {
"node": ">=10.0.0"
}
},
"node_modules/adal-node/node_modules/async": {
"version": "2.6.4",
"license": "MIT",
"dependencies": {
"lodash": "^4.17.14"
}
},
"node_modules/adal-node/node_modules/jwa": {
"version": "1.4.1",
"license": "MIT",
"dependencies": {
"buffer-equal-constant-time": "1.0.1",
"ecdsa-sig-formatter": "1.0.11",
"safe-buffer": "^5.0.1"
}
},
"node_modules/adal-node/node_modules/jws": {
"version": "3.2.2",
"license": "MIT",
"dependencies": {
"jwa": "^1.4.1",
"safe-buffer": "^5.0.1"
}
},
"node_modules/adal-node/node_modules/uuid": {
"version": "3.4.0",
"license": "MIT",
"bin": {
"uuid": "bin/uuid"
}
},
"node_modules/address": { "node_modules/address": {
"version": "1.1.2", "version": "1.1.2",
"dev": true, "dev": true,
@@ -14021,13 +13976,6 @@
"version": "1.12.0", "version": "1.12.0",
"license": "MIT" "license": "MIT"
}, },
"node_modules/axios": {
"version": "0.21.4",
"license": "MIT",
"dependencies": {
"follow-redirects": "^1.14.0"
}
},
"node_modules/babel-core": { "node_modules/babel-core": {
"version": "7.0.0-bridge.0", "version": "7.0.0-bridge.0",
"dev": true, "dev": true,
@@ -14731,7 +14679,7 @@
}, },
"node_modules/base64-js": { "node_modules/base64-js": {
"version": "1.5.1", "version": "1.5.1",
"devOptional": true, "dev": true,
"funding": [ "funding": [
{ {
"type": "github", "type": "github",
@@ -14803,8 +14751,9 @@
}, },
"node_modules/bl": { "node_modules/bl": {
"version": "4.1.0", "version": "4.1.0",
"devOptional": true, "dev": true,
"license": "MIT", "license": "MIT",
"peer": true,
"dependencies": { "dependencies": {
"buffer": "^5.5.0", "buffer": "^5.5.0",
"inherits": "^2.0.4", "inherits": "^2.0.4",
@@ -14813,7 +14762,7 @@
}, },
"node_modules/bl/node_modules/buffer": { "node_modules/bl/node_modules/buffer": {
"version": "5.7.1", "version": "5.7.1",
"devOptional": true, "dev": true,
"funding": [ "funding": [
{ {
"type": "github", "type": "github",
@@ -14829,6 +14778,7 @@
} }
], ],
"license": "MIT", "license": "MIT",
"peer": true,
"dependencies": { "dependencies": {
"base64-js": "^1.3.1", "base64-js": "^1.3.1",
"ieee754": "^1.1.13" "ieee754": "^1.1.13"
@@ -14836,8 +14786,9 @@
}, },
"node_modules/bl/node_modules/readable-stream": { "node_modules/bl/node_modules/readable-stream": {
"version": "3.6.2", "version": "3.6.2",
"devOptional": true, "dev": true,
"license": "MIT", "license": "MIT",
"peer": true,
"dependencies": { "dependencies": {
"inherits": "^2.0.3", "inherits": "^2.0.3",
"string_decoder": "^1.1.1", "string_decoder": "^1.1.1",
@@ -15407,11 +15358,6 @@
"node": ">=8.0" "node": ">=8.0"
} }
}, },
"node_modules/chownr": {
"version": "1.1.4",
"license": "ISC",
"optional": true
},
"node_modules/chrome-trace-event": { "node_modules/chrome-trace-event": {
"version": "1.0.3", "version": "1.0.3",
"license": "MIT", "license": "MIT",
@@ -17175,13 +17121,6 @@
"version": "1.29.0", "version": "1.29.0",
"license": "MIT" "license": "MIT"
}, },
"node_modules/date-utils": {
"version": "1.2.21",
"license": "MIT",
"engines": {
"node": ">0.4.0"
}
},
"node_modules/dayjs": { "node_modules/dayjs": {
"version": "1.8.19", "version": "1.8.19",
"license": "MIT" "license": "MIT"
@@ -17276,14 +17215,6 @@
"url": "https://github.com/sponsors/ljharb" "url": "https://github.com/sponsors/ljharb"
} }
}, },
"node_modules/deep-extend": {
"version": "0.6.0",
"license": "MIT",
"optional": true,
"engines": {
"node": ">=4.0.0"
}
},
"node_modules/deep-is": { "node_modules/deep-is": {
"version": "0.1.4", "version": "0.1.4",
"license": "MIT" "license": "MIT"
@@ -17344,7 +17275,6 @@
}, },
"node_modules/define-lazy-prop": { "node_modules/define-lazy-prop": {
"version": "2.0.0", "version": "2.0.0",
"dev": true,
"license": "MIT", "license": "MIT",
"engines": { "engines": {
"node": ">=8" "node": ">=8"
@@ -18976,14 +18906,6 @@
"version": "2.0.0", "version": "2.0.0",
"license": "MIT" "license": "MIT"
}, },
"node_modules/expand-template": {
"version": "2.0.3",
"license": "(MIT OR WTFPL)",
"optional": true,
"engines": {
"node": ">=6"
}
},
"node_modules/expect": { "node_modules/expect": {
"version": "29.7.0", "version": "29.7.0",
"resolved": "https://registry.npmjs.org/expect/-/expect-29.7.0.tgz", "resolved": "https://registry.npmjs.org/expect/-/expect-29.7.0.tgz",
@@ -19671,6 +19593,7 @@
}, },
"node_modules/follow-redirects": { "node_modules/follow-redirects": {
"version": "1.15.3", "version": "1.15.3",
"dev": true,
"funding": [ "funding": [
{ {
"type": "individual", "type": "individual",
@@ -19978,11 +19901,6 @@
"node": ">= 0.6" "node": ">= 0.6"
} }
}, },
"node_modules/fs-constants": {
"version": "1.0.0",
"license": "MIT",
"optional": true
},
"node_modules/fs-extra": { "node_modules/fs-extra": {
"version": "7.0.0", "version": "7.0.0",
"dev": true, "dev": true,
@@ -20197,11 +20115,6 @@
"assert-plus": "^1.0.0" "assert-plus": "^1.0.0"
} }
}, },
"node_modules/github-from-package": {
"version": "0.0.0",
"license": "MIT",
"optional": true
},
"node_modules/glob": { "node_modules/glob": {
"version": "7.2.3", "version": "7.2.3",
"license": "ISC", "license": "ISC",
@@ -21333,7 +21246,7 @@
}, },
"node_modules/ieee754": { "node_modules/ieee754": {
"version": "1.2.1", "version": "1.2.1",
"devOptional": true, "dev": true,
"funding": [ "funding": [
{ {
"type": "github", "type": "github",
@@ -21522,7 +21435,7 @@
}, },
"node_modules/ini": { "node_modules/ini": {
"version": "1.3.8", "version": "1.3.8",
"devOptional": true, "dev": true,
"license": "ISC" "license": "ISC"
}, },
"node_modules/internal-slot": { "node_modules/internal-slot": {
@@ -27558,8 +27471,9 @@
} }
}, },
"node_modules/jsonwebtoken": { "node_modules/jsonwebtoken": {
"version": "8.5.1", "version": "9.0.2",
"license": "MIT", "resolved": "https://registry.npmjs.org/jsonwebtoken/-/jsonwebtoken-9.0.2.tgz",
"integrity": "sha512-PRp66vJ865SSqOlgqS8hujT5U4AOgMfhrwYIuIhfKaoSCZcirrmASQr8CX7cUg+RMih+hgznrjp99o+W4pJLHQ==",
"dependencies": { "dependencies": {
"jws": "^3.2.2", "jws": "^3.2.2",
"lodash.includes": "^4.3.0", "lodash.includes": "^4.3.0",
@@ -27570,16 +27484,17 @@
"lodash.isstring": "^4.0.1", "lodash.isstring": "^4.0.1",
"lodash.once": "^4.0.0", "lodash.once": "^4.0.0",
"ms": "^2.1.1", "ms": "^2.1.1",
"semver": "^5.6.0" "semver": "^7.5.4"
}, },
"engines": { "engines": {
"node": ">=4", "node": ">=12",
"npm": ">=1.4.28" "npm": ">=6"
} }
}, },
"node_modules/jsonwebtoken/node_modules/jwa": { "node_modules/jsonwebtoken/node_modules/jwa": {
"version": "1.4.1", "version": "1.4.1",
"license": "MIT", "resolved": "https://registry.npmjs.org/jwa/-/jwa-1.4.1.tgz",
"integrity": "sha512-qiLX/xhEEFKUAJ6FiBMbes3w9ATzyk5W7Hvzpa/SLYdxNtng+gcurvrI7TbACjIXlsJyr05/S1oUhZrc63evQA==",
"dependencies": { "dependencies": {
"buffer-equal-constant-time": "1.0.1", "buffer-equal-constant-time": "1.0.1",
"ecdsa-sig-formatter": "1.0.11", "ecdsa-sig-formatter": "1.0.11",
@@ -27588,12 +27503,24 @@
}, },
"node_modules/jsonwebtoken/node_modules/jws": { "node_modules/jsonwebtoken/node_modules/jws": {
"version": "3.2.2", "version": "3.2.2",
"license": "MIT", "resolved": "https://registry.npmjs.org/jws/-/jws-3.2.2.tgz",
"integrity": "sha512-YHlZCB6lMTllWDtSPHz/ZXTsi8S00usEV6v1tjq8tOUZzw7DpSDWVXjXDre6ed1w/pd495ODpHZYSdkRTsa0HA==",
"dependencies": { "dependencies": {
"jwa": "^1.4.1", "jwa": "^1.4.1",
"safe-buffer": "^5.0.1" "safe-buffer": "^5.0.1"
} }
}, },
"node_modules/jsonwebtoken/node_modules/semver": {
"version": "7.6.3",
"resolved": "https://registry.npmjs.org/semver/-/semver-7.6.3.tgz",
"integrity": "sha512-oVekP1cKtI+CTDvHWYFUcMtsK/00wmAEfyqKfNdARm8u1wNVhSgaX7A8d4UuIlUI5e84iEwOhs7ZPYRmzU9U6A==",
"bin": {
"semver": "bin/semver.js"
},
"engines": {
"node": ">=10"
}
},
"node_modules/jsprim": { "node_modules/jsprim": {
"version": "1.4.2", "version": "1.4.2",
"license": "MIT", "license": "MIT",
@@ -27646,16 +27573,6 @@
"version": "2.6.0", "version": "2.6.0",
"license": "MIT" "license": "MIT"
}, },
"node_modules/keytar": {
"version": "7.9.0",
"hasInstallScript": true,
"license": "MIT",
"optional": true,
"dependencies": {
"node-addon-api": "^4.3.0",
"prebuild-install": "^7.0.1"
}
},
"node_modules/keyv": { "node_modules/keyv": {
"version": "4.5.4", "version": "4.5.4",
"license": "MIT", "license": "MIT",
@@ -27972,7 +27889,8 @@
}, },
"node_modules/lodash.includes": { "node_modules/lodash.includes": {
"version": "4.3.0", "version": "4.3.0",
"license": "MIT" "resolved": "https://registry.npmjs.org/lodash.includes/-/lodash.includes-4.3.0.tgz",
"integrity": "sha512-W3Bx6mdkRTGtlJISOvVD/lbqjTlPPUDTMnlXZFnVwi9NKJ6tiAk6LVdlhZMm17VZisqhKcgzpO5Wz91PCt5b0w=="
}, },
"node_modules/lodash.invokemap": { "node_modules/lodash.invokemap": {
"version": "4.6.0", "version": "4.6.0",
@@ -27981,7 +27899,8 @@
}, },
"node_modules/lodash.isboolean": { "node_modules/lodash.isboolean": {
"version": "3.0.3", "version": "3.0.3",
"license": "MIT" "resolved": "https://registry.npmjs.org/lodash.isboolean/-/lodash.isboolean-3.0.3.tgz",
"integrity": "sha512-Bz5mupy2SVbPHURB98VAcw+aHh4vRV5IPNhILUCsOzRmsTmSQ17jIuqopAentWoehktxGd9e/hbIXq980/1QJg=="
}, },
"node_modules/lodash.isequal": { "node_modules/lodash.isequal": {
"version": "4.5.0", "version": "4.5.0",
@@ -27989,19 +27908,23 @@
}, },
"node_modules/lodash.isinteger": { "node_modules/lodash.isinteger": {
"version": "4.0.4", "version": "4.0.4",
"license": "MIT" "resolved": "https://registry.npmjs.org/lodash.isinteger/-/lodash.isinteger-4.0.4.tgz",
"integrity": "sha512-DBwtEWN2caHQ9/imiNeEA5ys1JoRtRfY3d7V9wkqtbycnAmTvRRmbHKDV4a0EYc678/dia0jrte4tjYwVBaZUA=="
}, },
"node_modules/lodash.isnumber": { "node_modules/lodash.isnumber": {
"version": "3.0.3", "version": "3.0.3",
"license": "MIT" "resolved": "https://registry.npmjs.org/lodash.isnumber/-/lodash.isnumber-3.0.3.tgz",
"integrity": "sha512-QYqzpfwO3/CWf3XP+Z+tkQsfaLL/EnUlXWVkIk5FUPc4sBdTehEqZONuyRt2P67PXAk+NXmTBcc97zw9t1FQrw=="
}, },
"node_modules/lodash.isplainobject": { "node_modules/lodash.isplainobject": {
"version": "4.0.6", "version": "4.0.6",
"license": "MIT" "resolved": "https://registry.npmjs.org/lodash.isplainobject/-/lodash.isplainobject-4.0.6.tgz",
"integrity": "sha512-oSXzaWypCMHkPC3NvBEaPHf0KsA5mvPrOPgQWDsbg8n7orZ290M0BmC/jgRZ4vcJ6DTAhjrsSYgdsW/F+MFOBA=="
}, },
"node_modules/lodash.isstring": { "node_modules/lodash.isstring": {
"version": "4.0.1", "version": "4.0.1",
"license": "MIT" "resolved": "https://registry.npmjs.org/lodash.isstring/-/lodash.isstring-4.0.1.tgz",
"integrity": "sha512-0wJxfxH1wgO3GrbuP+dTTk7op+6L41QCXbGINEmD+ny/G/eCqGzxyCsh7159S+mgDDcoarnBw6PC1PS5+wUGgw=="
}, },
"node_modules/lodash.memoize": { "node_modules/lodash.memoize": {
"version": "4.1.2", "version": "4.1.2",
@@ -28013,7 +27936,8 @@
}, },
"node_modules/lodash.once": { "node_modules/lodash.once": {
"version": "4.1.1", "version": "4.1.1",
"license": "MIT" "resolved": "https://registry.npmjs.org/lodash.once/-/lodash.once-4.1.1.tgz",
"integrity": "sha512-Sb487aTOCr9drQVL8pIxOzVhafOjZN9UU54hiN8PU3uAiSV7lx1yYNpbNmex2PK6dSJoNTSJUUswT651yww3Mg=="
}, },
"node_modules/lodash.pullall": { "node_modules/lodash.pullall": {
"version": "4.2.0", "version": "4.2.0",
@@ -29526,11 +29450,6 @@
"node": ">=10" "node": ">=10"
} }
}, },
"node_modules/mkdirp-classic": {
"version": "0.5.3",
"license": "MIT",
"optional": true
},
"node_modules/moment": { "node_modules/moment": {
"version": "2.29.4", "version": "2.29.4",
"license": "MIT", "license": "MIT",
@@ -29597,16 +29516,6 @@
"version": "2.1.3", "version": "2.1.3",
"license": "MIT" "license": "MIT"
}, },
"node_modules/msal": {
"version": "1.4.18",
"license": "MIT",
"dependencies": {
"tslib": "^1.9.3"
},
"engines": {
"node": ">=0.8.0"
}
},
"node_modules/multicast-dns": { "node_modules/multicast-dns": {
"version": "7.2.5", "version": "7.2.5",
"dev": true, "dev": true,
@@ -29667,11 +29576,6 @@
"node": ">=0.10.0" "node": ">=0.10.0"
} }
}, },
"node_modules/napi-build-utils": {
"version": "1.0.2",
"license": "MIT",
"optional": true
},
"node_modules/native-promise-only": { "node_modules/native-promise-only": {
"version": "0.8.1", "version": "0.8.1",
"dev": true, "dev": true,
@@ -29756,58 +29660,12 @@
"node": ">=12.0.0" "node": ">=12.0.0"
} }
}, },
"node_modules/node-abi": {
"version": "3.60.0",
"license": "MIT",
"optional": true,
"dependencies": {
"semver": "^7.3.5"
},
"engines": {
"node": ">=10"
}
},
"node_modules/node-abi/node_modules/lru-cache": {
"version": "6.0.0",
"license": "ISC",
"optional": true,
"dependencies": {
"yallist": "^4.0.0"
},
"engines": {
"node": ">=10"
}
},
"node_modules/node-abi/node_modules/semver": {
"version": "7.6.0",
"license": "ISC",
"optional": true,
"dependencies": {
"lru-cache": "^6.0.0"
},
"bin": {
"semver": "bin/semver.js"
},
"engines": {
"node": ">=10"
}
},
"node_modules/node-abi/node_modules/yallist": {
"version": "4.0.0",
"license": "ISC",
"optional": true
},
"node_modules/node-abort-controller": { "node_modules/node-abort-controller": {
"version": "3.1.1", "version": "3.1.1",
"dev": true, "dev": true,
"license": "MIT", "license": "MIT",
"peer": true "peer": true
}, },
"node_modules/node-addon-api": {
"version": "4.3.0",
"license": "MIT",
"optional": true
},
"node_modules/node-dir": { "node_modules/node-dir": {
"version": "0.1.17", "version": "0.1.17",
"dev": true, "dev": true,
@@ -31006,80 +30864,6 @@
"version": "4.2.0", "version": "4.2.0",
"license": "MIT" "license": "MIT"
}, },
"node_modules/prebuild-install": {
"version": "7.1.2",
"license": "MIT",
"optional": true,
"dependencies": {
"detect-libc": "^2.0.0",
"expand-template": "^2.0.3",
"github-from-package": "0.0.0",
"minimist": "^1.2.3",
"mkdirp-classic": "^0.5.3",
"napi-build-utils": "^1.0.1",
"node-abi": "^3.3.0",
"pump": "^3.0.0",
"rc": "^1.2.7",
"simple-get": "^4.0.0",
"tar-fs": "^2.0.0",
"tunnel-agent": "^0.6.0"
},
"bin": {
"prebuild-install": "bin.js"
},
"engines": {
"node": ">=10"
}
},
"node_modules/prebuild-install/node_modules/decompress-response": {
"version": "6.0.0",
"license": "MIT",
"optional": true,
"dependencies": {
"mimic-response": "^3.1.0"
},
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/prebuild-install/node_modules/mimic-response": {
"version": "3.1.0",
"license": "MIT",
"optional": true,
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/prebuild-install/node_modules/simple-get": {
"version": "4.0.1",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/feross"
},
{
"type": "patreon",
"url": "https://www.patreon.com/feross"
},
{
"type": "consulting",
"url": "https://feross.org/support"
}
],
"license": "MIT",
"optional": true,
"dependencies": {
"decompress-response": "^6.0.0",
"once": "^1.3.1",
"simple-concat": "^1.0.0"
}
},
"node_modules/prelude-ls": { "node_modules/prelude-ls": {
"version": "1.2.1", "version": "1.2.1",
"license": "MIT", "license": "MIT",
@@ -31509,28 +31293,6 @@
"version": "0.5.1", "version": "0.5.1",
"dev": true "dev": true
}, },
"node_modules/rc": {
"version": "1.2.8",
"license": "(BSD-2-Clause OR MIT OR Apache-2.0)",
"optional": true,
"dependencies": {
"deep-extend": "^0.6.0",
"ini": "~1.3.0",
"minimist": "^1.2.0",
"strip-json-comments": "~2.0.1"
},
"bin": {
"rc": "cli.js"
}
},
"node_modules/rc/node_modules/strip-json-comments": {
"version": "2.0.1",
"license": "MIT",
"optional": true,
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/re-resizable": { "node_modules/re-resizable": {
"version": "6.9.11", "version": "6.9.11",
"license": "MIT", "license": "MIT",
@@ -34439,45 +34201,6 @@
"node": ">=10" "node": ">=10"
} }
}, },
"node_modules/tar-fs": {
"version": "2.1.1",
"license": "MIT",
"optional": true,
"dependencies": {
"chownr": "^1.1.1",
"mkdirp-classic": "^0.5.2",
"pump": "^3.0.0",
"tar-stream": "^2.1.4"
}
},
"node_modules/tar-stream": {
"version": "2.2.0",
"license": "MIT",
"optional": true,
"dependencies": {
"bl": "^4.0.3",
"end-of-stream": "^1.4.1",
"fs-constants": "^1.0.0",
"inherits": "^2.0.3",
"readable-stream": "^3.1.1"
},
"engines": {
"node": ">=6"
}
},
"node_modules/tar-stream/node_modules/readable-stream": {
"version": "3.6.2",
"license": "MIT",
"optional": true,
"dependencies": {
"inherits": "^2.0.3",
"string_decoder": "^1.1.1",
"util-deprecate": "^1.0.1"
},
"engines": {
"node": ">= 6"
}
},
"node_modules/tar/node_modules/chownr": { "node_modules/tar/node_modules/chownr": {
"version": "2.0.0", "version": "2.0.0",
"resolved": "https://registry.npmjs.org/chownr/-/chownr-2.0.0.tgz", "resolved": "https://registry.npmjs.org/chownr/-/chownr-2.0.0.tgz",
@@ -36815,13 +36538,6 @@
"integrity": "sha512-JZnDKK8B0RCDw84FNdDAIpZK+JuJw+s7Lz8nksI7SIuU3UXJJslUthsi+uWBUYOwPFwW7W7PRLRfUKpxjtjFCw==", "integrity": "sha512-JZnDKK8B0RCDw84FNdDAIpZK+JuJw+s7Lz8nksI7SIuU3UXJJslUthsi+uWBUYOwPFwW7W7PRLRfUKpxjtjFCw==",
"dev": true "dev": true
}, },
"node_modules/xpath.js": {
"version": "1.1.0",
"license": "MIT",
"engines": {
"node": ">=0.4.0"
}
},
"node_modules/xtend": { "node_modules/xtend": {
"version": "4.0.2", "version": "4.0.2",
"license": "MIT", "license": "MIT",


@@ -7,15 +7,14 @@
     "@azure/arm-cosmosdb": "9.1.0",
     "@azure/cosmos": "4.2.0-beta.1",
     "@azure/cosmos-language-service": "0.0.5",
-    "@azure/identity": "1.5.2",
-    "@azure/ms-rest-nodeauth": "3.1.1",
+    "@azure/identity": "4.5.0",
     "@azure/msal-browser": "2.14.2",
     "@babel/plugin-proposal-class-properties": "7.12.1",
     "@babel/plugin-proposal-decorators": "7.12.12",
     "@fluentui/react": "8.119.0",
     "@fluentui/react-components": "9.54.2",
-    "@jupyterlab/services": "6.0.2",
     "@jupyterlab/terminal": "3.0.3",
+    "@jupyterlab/services": "6.0.2",
     "@microsoft/applicationinsights-web": "2.6.1",
     "@nteract/commutable": "7.5.1",
     "@nteract/connected-components": "6.8.2",
@@ -47,6 +46,7 @@
     "@types/mkdirp": "1.0.1",
     "@types/node-fetch": "2.5.7",
     "@xmldom/xmldom": "0.7.13",
+    "@xterm/xterm": "5.5.0",
     "allotment": "1.20.2",
     "applicationinsights": "1.8.0",
     "bootstrap": "3.4.1",


@@ -1,7 +1,7 @@
 [defaults]
-group = stfaul
-sku = P1v2
-appserviceplan = stfaul_asp_Linux_centralus_0
-location = centralus
-web = cosmos-explorer-preview
+group = dataexplorer-preview
+sku = P1V2
+appserviceplan = dataexplorer-preview
+location = westus2
+web = dataexplorer-preview


@@ -4,8 +4,8 @@ Cosmos Explorer Preview makes it possible to try a working version of any commit
 Initial support is for Hosted (Connection string only) or the Azure Portal. Examples:
-Connection string URLs: https://cosmos-explorer-preview.azurewebsites.net/commit/COMMIT_SHA/hostedExplorer.html
+Connection string URLs: https://dataexplorer-preview.azurewebsites.net/commit/COMMIT_SHA/hostedExplorer.html
-Portal URLs: https://ms.portal.azure.com/?dataExplorerSource=https://cosmos-explorer-preview.azurewebsites.net/commit/COMMIT_SHA/explorer.html#home
+Portal URLs: https://ms.portal.azure.com/?dataExplorerSource=https://dataexplorer-preview.azurewebsites.net/commit/COMMIT_SHA/explorer.html#home
 In both cases replace `COMMIT_SHA` with the commit you want to view. It must have already completed its build on GitHub Actions.
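
The README hunk above describes the preview URL scheme. As an illustration only (this helper is not part of the repository; it simply restates the two URL forms shown above with the dataexplorer-preview host from this diff), the substitution of a commit SHA looks like this:

```ts
// Hypothetical sketch of the preview URL scheme described in the README above; not repo code.
const previewSite = "https://dataexplorer-preview.azurewebsites.net";

// Hosted (connection string) preview for a given commit SHA.
function hostedPreviewUrl(commitSha: string): string {
  return `${previewSite}/commit/${commitSha}/hostedExplorer.html`;
}

// Portal preview: the portal loads Data Explorer from the dataExplorerSource query parameter.
function portalPreviewUrl(commitSha: string): string {
  const explorer = `${previewSite}/commit/${commitSha}/explorer.html`;
  return `https://ms.portal.azure.com/?dataExplorerSource=${explorer}#home`;
}
```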


@@ -1,4 +1,4 @@
 {
   "PROXY_PATH": "/proxy",
-  "msalRedirectURI": "https://cosmos-explorer-preview.azurewebsites.net/"
+  "msalRedirectURI": "https://dataexplorer-preview.azurewebsites.net/"
 }


@@ -3,8 +3,15 @@ const { createProxyMiddleware } = require("http-proxy-middleware");
 const port = process.env.PORT || 3000;
 const fetch = require("node-fetch");
-const api = createProxyMiddleware("/api", {
-  target: "https://cdb-ms-mpac-pbe.cosmos.azure.com",
+const backendEndpoint = "https://cdb-ms-mpac-pbe.cosmos.azure.com";
+const previewSiteEndpoint = "https://dataexplorer-preview.azurewebsites.net";
+const previewStorageWebsiteEndpoint = "https://dataexplorerpreview.z5.web.core.windows.net/";
+const githubApiUrl = "https://api.github.com/repos/Azure/cosmos-explorer";
+const githubPullRequestUrl = "https://github.com/Azure/cosmos-explorer/pull";
+const azurePortalMpacEndpoint = "https://ms.portal.azure.com/";
+const api = createProxyMiddleware({
+  target: backendEndpoint,
   changeOrigin: true,
   logLevel: "debug",
   bypass: (req, res) => {
@@ -15,8 +22,8 @@ const api = createProxyMiddleware("/api", {
   },
 });
-const proxy = createProxyMiddleware("/proxy", {
-  target: "https://cdb-ms-mpac-pbe.cosmos.azure.com",
+const proxy = createProxyMiddleware({
+  target: backendEndpoint,
   changeOrigin: true,
   secure: false,
   logLevel: "debug",
@@ -27,35 +34,38 @@ const proxy = createProxyMiddleware("/proxy", {
   },
 });
-const commit = createProxyMiddleware("/commit", {
-  target: "https://cosmosexplorerpreview.blob.core.windows.net",
+const commit = createProxyMiddleware({
+  target: previewStorageWebsiteEndpoint,
   changeOrigin: true,
   secure: false,
   logLevel: "debug",
-  pathRewrite: { "^/commit": "$web/" },
+  pathRewrite: { "^/commit": "/" },
 });
 const app = express();
-app.use(api);
-app.use(proxy);
-app.use(commit);
+app.use("/api", api);
+app.use("/proxy", proxy);
+app.use("/commit", commit);
 app.get("/pull/:pr(\\d+)", (req, res) => {
   const pr = req.params.pr;
+  if (!/^\d+$/.test(pr)) {
+    return res.status(400).send("Invalid pull request number");
+  }
   const [, query] = req.originalUrl.split("?");
   const search = new URLSearchParams(query);
-  fetch("https://api.github.com/repos/Azure/cosmos-explorer/pulls/" + pr)
+  fetch(`${githubApiUrl}/pulls/${pr}`)
     .then((response) => response.json())
     .then(({ head: { ref, sha } }) => {
-      const prUrl = new URL("https://github.com/Azure/cosmos-explorer/pull/" + pr);
+      const prUrl = new URL(`${githubPullRequestUrl}/${pr}`);
       prUrl.hash = ref;
       search.set("feature.pr", prUrl.href);
-      const explorer = new URL("https://cosmos-explorer-preview.azurewebsites.net/commit/" + sha + "/explorer.html");
+      const explorer = new URL(`${previewSiteEndpoint}/commit/${sha}/explorer.html`);
       explorer.search = search.toString();
-      const portal = new URL("https://ms.portal.azure.com/");
+      const portal = new URL(azurePortalMpacEndpoint);
       portal.searchParams.set("dataExplorerSource", explorer.href);
       return res.redirect(portal.href);
@@ -63,12 +73,10 @@ app.get("/pull/:pr(\\d+)", (req, res) => {
     .catch(() => res.sendStatus(500));
 });
 app.get("/", (req, res) => {
-  fetch("https://api.github.com/repos/Azure/cosmos-explorer/branches/master")
+  fetch(`${githubApiUrl}/branches/master`)
     .then((response) => response.json())
     .then(({ commit: { sha } }) => {
-      const explorer = new URL(
-        "https://cosmos-explorer-preview.azurewebsites.net/commit/" + sha + "/hostedExplorer.html"
-      );
+      const explorer = new URL(`${previewSiteEndpoint}/commit/${sha}/hostedExplorer.html`);
       return res.redirect(explorer.href);
     })
     .catch(() => res.sendStatus(500));

preview/package-lock.json (generated, 1360 changed lines)

File diff suppressed because it is too large.


@@ -4,7 +4,7 @@
   "description": "",
   "main": "index.js",
   "scripts": {
-    "deploy": "az webapp up --name \"cosmos-explorer-preview\" --subscription \"cosmosdb-portalteam-generaltest-msft\" --resource-group \"stfaul\"",
+    "deploy": "az webapp up --name \"dataexplorer-preview\" --subscription \"cosmosdb-portalteam-runners\" --resource-group \"dataexplorer-preview\" --runtime \"NODE:18-lts\" --sku P1V2",
     "start": "node index.js",
     "test": "echo \"Error: no test specified\" && exit 1"
   },
@@ -12,7 +12,8 @@
   "author": "Microsoft Corporation",
   "dependencies": {
     "express": "^4.17.1",
-    "http-proxy-middleware": "^1.1.0",
+    "http-proxy-middleware": "^3.0.3",
+    "node": "^18.20.6",
     "node-fetch": "^2.6.1"
   }
 }


@@ -97,6 +97,12 @@ export enum CapacityMode {
   Serverless = "Serverless",
 }
+export enum WorkloadType {
+  Learning = "Learning",
+  DevelopmentTesting = "Development/Testing",
+  Production = "Production",
+  None = "None",
+}
 // flight names returned from the portal are always lowercase
 export class Flights {
   public static readonly SettingsV2 = "settingsv2";
@@ -119,6 +125,7 @@ export class AfecFeatures {
 export class TagNames {
   public static defaultExperience: string = "defaultExperience";
+  public static WorkloadType: string = "hidden-workload-type";
 }
 export class MongoDBAccounts {
@@ -518,6 +525,11 @@ export class PriorityLevel {
   public static readonly Default = "low";
 }
+export class ariaLabelForLearnMoreLink {
+  public static readonly AnalyticalStore = "Learn more about analytical store.";
+  public static readonly AzureSynapseLink = "Learn more about Azure Synapse Link.";
+}
 export const QueryCopilotSampleDatabaseId = "CopilotSampleDB";
 export const QueryCopilotSampleContainerId = "SampleContainer";


@@ -3,9 +3,8 @@ import { getAuthorizationTokenUsingResourceTokens } from "Common/getAuthorizatio
 import { AuthorizationToken } from "Contracts/FabricMessageTypes";
 import { checkDatabaseResourceTokensValidity } from "Platform/Fabric/FabricUtil";
 import { LocalStorageUtility, StorageKey } from "Shared/StorageUtility";
-import { useNewPortalBackendEndpoint } from "Utils/EndpointUtils";
 import { AuthType } from "../AuthType";
-import { BackendApi, PriorityLevel } from "../Common/Constants";
+import { PriorityLevel } from "../Common/Constants";
 import * as Logger from "../Common/Logger";
 import { Platform, configContext } from "../ConfigContext";
 import { updateUserContext, userContext } from "../UserContext";
@@ -125,10 +124,6 @@ export async function getTokenFromAuthService(
   resourceType: string,
   resourceId?: string,
 ): Promise<AuthorizationToken> {
-  if (!useNewPortalBackendEndpoint(BackendApi.RuntimeProxy)) {
-    return getTokenFromAuthService_ToBeDeprecated(verb, resourceType, resourceId);
-  }
   try {
     const host: string = configContext.PORTAL_BACKEND_ENDPOINT;
     const response: Response = await _global.fetch(host + "/api/connectionstring/runtimeproxy/authorizationtokens", {
@@ -151,34 +146,6 @@
   }
 }
-export async function getTokenFromAuthService_ToBeDeprecated(
-  verb: string,
-  resourceType: string,
-  resourceId?: string,
-): Promise<AuthorizationToken> {
-  try {
-    const host = configContext.BACKEND_ENDPOINT;
-    const response = await _global.fetch(host + "/api/guest/runtimeproxy/authorizationTokens", {
-      method: "POST",
-      headers: {
-        "content-type": "application/json",
-        "x-ms-encrypted-auth-token": userContext.accessToken,
-      },
-      body: JSON.stringify({
-        verb,
-        resourceType,
-        resourceId,
-      }),
-    });
-    //TODO I am not sure why we have to parse the JSON again here. fetch should do it for us when we call .json()
-    const result = JSON.parse(await response.json());
-    return result;
-  } catch (error) {
-    logConsoleError(`Failed to get authorization headers for ${resourceType}: ${getErrorMessage(error)}`);
-    return Promise.reject(error);
-  }
-}
 // The Capability is a bitmap, which cosmosdb backend decodes as per the below enum
 enum SDKSupportedCapabilities {
   None = 0,
@@ -203,8 +170,10 @@ export function client(): Cosmos.CosmosClient {
 }
 let _defaultHeaders: Cosmos.CosmosHeaders = {};
 _defaultHeaders["x-ms-cosmos-sdk-supportedcapabilities"] =
   SDKSupportedCapabilities.None | SDKSupportedCapabilities.PartitionMerge;
+_defaultHeaders["x-ms-cosmos-throughput-bucket"] = 1;
 if (
   userContext.authType === AuthType.ConnectionString ||


@@ -0,0 +1,34 @@
import { WorkloadType } from "Common/Constants";
import { getWorkloadType } from "Common/DatabaseAccountUtility";
import { DatabaseAccount, Tags } from "Contracts/DataModels";
import { updateUserContext } from "UserContext";
describe("Database Account Utility", () => {
describe("Workload Type", () => {
beforeEach(() => {
updateUserContext({
databaseAccount: {
tags: {} as Tags,
} as DatabaseAccount,
});
});
it("Workload Type should return Learning", () => {
updateUserContext({
databaseAccount: {
tags: {
"hidden-workload-type": WorkloadType.Learning,
} as Tags,
} as DatabaseAccount,
});
const workloadType: WorkloadType = getWorkloadType();
expect(workloadType).toBe(WorkloadType.Learning);
});
it("Workload Type should return None", () => {
const workloadType: WorkloadType = getWorkloadType();
expect(workloadType).toBe(WorkloadType.None);
});
});
});


@@ -1,3 +1,5 @@
+import { TagNames, WorkloadType } from "Common/Constants";
+import { Tags } from "Contracts/DataModels";
 import { userContext } from "../UserContext";
 function isVirtualNetworkFilterEnabled() {
@@ -15,3 +17,12 @@ function isPrivateEndpointConnectionsEnabled() {
 export function isPublicInternetAccessAllowed(): boolean {
   return !isVirtualNetworkFilterEnabled() && !isIpRulesEnabled() && !isPrivateEndpointConnectionsEnabled();
 }
+export function getWorkloadType(): WorkloadType {
+  const tags: Tags = userContext?.databaseAccount?.tags;
+  const workloadType: WorkloadType = tags && (tags[TagNames.WorkloadType] as WorkloadType);
+  if (!workloadType) {
+    return WorkloadType.None;
+  }
+  return workloadType;
+}


@@ -1,4 +1,4 @@
// import { QueryOperationOptions } from "@azure/cosmos"; import { QueryOperationOptions } from "@azure/cosmos";
import { QueryResults } from "../Contracts/ViewModels"; import { QueryResults } from "../Contracts/ViewModels";
interface QueryResponse { interface QueryResponse {
@@ -11,13 +11,17 @@ interface QueryResponse {
}
export interface MinimalQueryIterator {
fetchNext: () => Promise<QueryResponse>;
fetchNext: (queryOperationOptions?: QueryOperationOptions) => Promise<QueryResponse>;
}
// Pick<QueryIterator<any>, "fetchNext">;
export function nextPage(documentsIterator: MinimalQueryIterator, firstItemIndex: number): Promise<QueryResults> {
return documentsIterator.fetchNext().then((response) => {
export function nextPage(
documentsIterator: MinimalQueryIterator,
firstItemIndex: number,
queryOperationOptions?: QueryOperationOptions,
): Promise<QueryResults> {
return documentsIterator.fetchNext(queryOperationOptions).then((response) => {
const documents = response.resources;
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const headers = (response as any).headers || {}; // TODO this is a private key. Remove any
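The widened `MinimalQueryIterator` means `nextPage` can forward per-operation options to `fetchNext`. A hypothetical caller, assuming the import aliases used elsewhere in the repo:

```ts
import { QueryOperationOptions } from "@azure/cosmos";
import { MinimalQueryIterator, nextPage } from "Common/IteratorUtilities";
import { QueryResults } from "Contracts/ViewModels";

// Sketch: fetch the first page, passing any per-operation options straight through.
export function fetchFirstPage(
  iterator: MinimalQueryIterator,
  queryOperationOptions?: QueryOperationOptions,
): Promise<QueryResults> {
  return nextPage(iterator, 0, queryOperationOptions);
}
```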

View File

@@ -4,16 +4,8 @@ import { configContext, resetConfigContext, updateConfigContext } from "../Confi
import { DatabaseAccount } from "../Contracts/DataModels";
import { Collection } from "../Contracts/ViewModels";
import DocumentId from "../Explorer/Tree/DocumentId";
import { extractFeatures } from "../Platform/Hosted/extractFeatures";
import { updateUserContext } from "../UserContext";
import {
deleteDocument,
getEndpoint,
getFeatureEndpointOrDefault,
queryDocuments,
readDocument,
updateDocument,
} from "./MongoProxyClient";
import { deleteDocuments, getEndpoint, queryDocuments, readDocument, updateDocument } from "./MongoProxyClient";
const databaseId = "testDB";
@@ -196,20 +188,8 @@ describe("MongoProxyClient", () => {
expect.any(Object),
);
});
it("builds the correct proxy URL in development", () => {
updateConfigContext({
MONGO_BACKEND_ENDPOINT: "https://localhost:1234",
globallyEnabledMongoAPIs: [],
});
updateDocument(databaseId, collection, documentId, "{}");
expect(window.fetch).toHaveBeenCalledWith(
`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer`,
expect.any(Object),
);
});
});
describe("deleteDocument", () => {
describe("deleteDocuments", () => {
beforeEach(() => {
resetConfigContext();
updateUserContext({
@@ -226,9 +206,9 @@ describe("MongoProxyClient", () => {
});
it("builds the correct URL", () => {
deleteDocument(databaseId, collection, documentId);
deleteDocuments(databaseId, collection, [documentId]);
expect(window.fetch).toHaveBeenCalledWith(
`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer`,
`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer/bulkdelete`,
expect.any(Object),
);
});
@@ -238,9 +218,9 @@ describe("MongoProxyClient", () => {
MONGO_PROXY_ENDPOINT: "https://localhost:1234", MONGO_PROXY_ENDPOINT: "https://localhost:1234",
globallyEnabledMongoAPIs: [], globallyEnabledMongoAPIs: [],
}); });
deleteDocument(databaseId, collection, documentId); deleteDocuments(databaseId, collection, [documentId]);
expect(window.fetch).toHaveBeenCalledWith( expect(window.fetch).toHaveBeenCalledWith(
`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer`, `${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer/bulkdelete`,
expect.any(Object), expect.any(Object),
); );
}); });
@@ -275,33 +255,4 @@ describe("MongoProxyClient", () => {
expect(endpoint).toEqual(`${configContext.MONGO_PROXY_ENDPOINT}/api/connectionstring/mongo/explorer`);
});
});
describe("getFeatureEndpointOrDefault", () => {
beforeEach(() => {
resetConfigContext();
updateConfigContext({
MONGO_PROXY_ENDPOINT: MongoProxyEndpoints.Prod,
globallyEnabledMongoAPIs: [],
});
const params = new URLSearchParams({
"feature.mongoProxyEndpoint": MongoProxyEndpoints.Prod,
"feature.mongoProxyAPIs": "readDocument|createDocument",
});
const features = extractFeatures(params);
updateUserContext({
authType: AuthType.AAD,
features: features,
});
});
it("returns a local endpoint", () => {
const endpoint = getFeatureEndpointOrDefault("readDocument");
expect(endpoint).toEqual(`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer`);
});
it("returns a production endpoint", () => {
const endpoint = getFeatureEndpointOrDefault("DeleteDocument");
expect(endpoint).toEqual(`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer`);
});
});
});

View File

@@ -1,20 +1,13 @@
import { Constants as CosmosSDKConstants } from "@azure/cosmos";
import {
allowedMongoProxyEndpoints_ToBeDeprecated,
defaultAllowedMongoProxyEndpoints,
validateEndpoint,
} from "Utils/EndpointUtils";
import queryString from "querystring";
import { AuthType } from "../AuthType";
import { configContext } from "../ConfigContext";
import * as DataModels from "../Contracts/DataModels";
import { MessageTypes } from "../Contracts/ExplorerContracts";
import { Collection } from "../Contracts/ViewModels";
import DocumentId from "../Explorer/Tree/DocumentId";
import { hasFlag } from "../Platform/Hosted/extractFeatures";
import { userContext } from "../UserContext";
import { logConsoleError } from "../Utils/NotificationConsoleUtils";
import { ApiType, ContentType, HttpHeaders, HttpStatusCodes, MongoProxyApi, MongoProxyEndpoints } from "./Constants";
import { ApiType, ContentType, HttpHeaders, HttpStatusCodes } from "./Constants";
import { MinimalQueryIterator } from "./IteratorUtilities";
import { sendMessage } from "./MessageHandler";
@@ -67,10 +60,6 @@ export function queryDocuments(
query: string,
continuationToken?: string,
): Promise<QueryResponse> {
if (!useMongoProxyEndpoint(MongoProxyApi.ResourceList) || !useMongoProxyEndpoint(MongoProxyApi.QueryDocuments)) {
return queryDocuments_ToBeDeprecated(databaseId, collection, isResourceList, query, continuationToken);
}
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const params = {
@@ -89,7 +78,7 @@ export function queryDocuments(
query,
};
const endpoint = getFeatureEndpointOrDefault(MongoProxyApi.ResourceList) || "";
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT) || "";
const headers = {
...defaultHeaders,
@@ -127,76 +116,11 @@ export function queryDocuments(
});
}
function queryDocuments_ToBeDeprecated(
databaseId: string,
collection: Collection,
isResourceList: boolean,
query: string,
continuationToken?: string,
): Promise<QueryResponse> {
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const params = {
db: databaseId,
coll: collection.id(),
resourceUrl: `${resourceEndpoint}dbs/${databaseId}/colls/${collection.id()}/docs/`,
rid: collection.rid,
rtype: "docs",
sid: userContext.subscriptionId,
rg: userContext.resourceGroup,
dba: databaseAccount.name,
pk:
collection && collection.partitionKey && !collection.partitionKey.systemKey
? collection.partitionKeyProperties?.[0]
: "",
};
const endpoint = getFeatureEndpointOrDefault("resourcelist") || "";
const headers = {
...defaultHeaders,
...authHeaders(),
[CosmosSDKConstants.HttpHeaders.IsQuery]: "true",
[CosmosSDKConstants.HttpHeaders.PopulateQueryMetrics]: "true",
[CosmosSDKConstants.HttpHeaders.EnableScanInQuery]: "true",
[CosmosSDKConstants.HttpHeaders.EnableCrossPartitionQuery]: "true",
[CosmosSDKConstants.HttpHeaders.ParallelizeCrossPartitionQuery]: "true",
[HttpHeaders.contentType]: "application/query+json",
};
if (continuationToken) {
headers[CosmosSDKConstants.HttpHeaders.Continuation] = continuationToken;
}
const path = isResourceList ? "/resourcelist" : "";
return window
.fetch(`${endpoint}${path}?${queryString.stringify(params)}`, {
method: "POST",
body: JSON.stringify({ query }),
headers,
})
.then(async (response) => {
if (response.ok) {
return {
continuationToken: response.headers.get(CosmosSDKConstants.HttpHeaders.Continuation),
documents: (await response.json()).Documents as DataModels.DocumentId[],
headers: response.headers,
};
}
await errorHandling(response, "querying documents", params);
return undefined;
});
}
export function readDocument(
databaseId: string,
collection: Collection,
documentId: DocumentId,
): Promise<DataModels.DocumentId> {
if (!useMongoProxyEndpoint(MongoProxyApi.ReadDocument)) {
return readDocument_ToBeDeprecated(databaseId, collection, documentId);
}
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const idComponents = documentId.self.split("/");
@@ -217,7 +141,7 @@ export function readDocument(
: "",
};
const endpoint = getFeatureEndpointOrDefault(MongoProxyApi.ReadDocument);
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT);
return window
.fetch(endpoint, {
@@ -237,61 +161,12 @@ export function readDocument(
});
}
export function readDocument_ToBeDeprecated(
databaseId: string,
collection: Collection,
documentId: DocumentId,
): Promise<DataModels.DocumentId> {
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const idComponents = documentId.self.split("/");
const path = idComponents.slice(0, 4).join("/");
const rid = encodeURIComponent(idComponents[5]);
const params = {
db: databaseId,
coll: collection.id(),
resourceUrl: `${resourceEndpoint}${path}/${rid}`,
rid,
rtype: "docs",
sid: userContext.subscriptionId,
rg: userContext.resourceGroup,
dba: databaseAccount.name,
pk:
documentId && documentId.partitionKey && !documentId.partitionKey.systemKey
? documentId.partitionKeyProperties?.[0]
: "",
};
const endpoint = getFeatureEndpointOrDefault("readDocument");
return window
.fetch(`${endpoint}?${queryString.stringify(params)}`, {
method: "GET",
headers: {
...defaultHeaders,
...authHeaders(),
[CosmosSDKConstants.HttpHeaders.PartitionKey]: encodeURIComponent(
JSON.stringify(documentId.partitionKeyHeader()),
),
},
})
.then(async (response) => {
if (response.ok) {
return response.json();
}
return await errorHandling(response, "reading document", params);
});
}
export function createDocument(
databaseId: string,
collection: Collection,
partitionKeyProperty: string,
documentContent: unknown,
): Promise<DataModels.DocumentId> {
if (!useMongoProxyEndpoint(MongoProxyApi.CreateDocument)) {
return createDocument_ToBeDeprecated(databaseId, collection, partitionKeyProperty, documentContent);
}
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const params = {
@@ -308,7 +183,7 @@ export function createDocument(
documentContent: JSON.stringify(documentContent),
};
const endpoint = getFeatureEndpointOrDefault(MongoProxyApi.CreateDocument);
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT);
return window
.fetch(`${endpoint}/createDocument`, {
@@ -328,54 +203,12 @@ export function createDocument(
});
}
export function createDocument_ToBeDeprecated(
databaseId: string,
collection: Collection,
partitionKeyProperty: string,
documentContent: unknown,
): Promise<DataModels.DocumentId> {
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const params = {
db: databaseId,
coll: collection.id(),
resourceUrl: `${resourceEndpoint}dbs/${databaseId}/colls/${collection.id()}/docs/`,
rid: collection.rid,
rtype: "docs",
sid: userContext.subscriptionId,
rg: userContext.resourceGroup,
dba: databaseAccount.name,
pk: collection && collection.partitionKey && !collection.partitionKey.systemKey ? partitionKeyProperty : "",
};
const endpoint = getFeatureEndpointOrDefault("createDocument");
return window
.fetch(`${endpoint}/resourcelist?${queryString.stringify(params)}`, {
method: "POST",
body: JSON.stringify(documentContent),
headers: {
...defaultHeaders,
...authHeaders(),
},
})
.then(async (response) => {
if (response.ok) {
return response.json();
}
return await errorHandling(response, "creating document", params);
});
}
export function updateDocument(
databaseId: string,
collection: Collection,
documentId: DocumentId,
documentContent: string,
): Promise<DataModels.DocumentId> {
if (!useMongoProxyEndpoint(MongoProxyApi.UpdateDocument)) {
return updateDocument_ToBeDeprecated(databaseId, collection, documentId, documentContent);
}
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const idComponents = documentId.self.split("/");
@@ -396,7 +229,7 @@ export function updateDocument(
: "",
documentContent,
};
const endpoint = getFeatureEndpointOrDefault(MongoProxyApi.UpdateDocument);
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT);
return window
.fetch(endpoint, {
@@ -417,139 +250,6 @@ export function updateDocument(
});
}
export function updateDocument_ToBeDeprecated(
databaseId: string,
collection: Collection,
documentId: DocumentId,
documentContent: string,
): Promise<DataModels.DocumentId> {
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const idComponents = documentId.self.split("/");
const path = idComponents.slice(0, 5).join("/");
const rid = encodeURIComponent(idComponents[5]);
const params = {
db: databaseId,
coll: collection.id(),
resourceUrl: `${resourceEndpoint}${path}/${rid}`,
rid,
rtype: "docs",
sid: userContext.subscriptionId,
rg: userContext.resourceGroup,
dba: databaseAccount.name,
pk:
documentId && documentId.partitionKey && !documentId.partitionKey.systemKey
? documentId.partitionKeyProperties?.[0]
: "",
};
const endpoint = getFeatureEndpointOrDefault("updateDocument");
return window
.fetch(`${endpoint}?${queryString.stringify(params)}`, {
method: "PUT",
body: documentContent,
headers: {
...defaultHeaders,
...authHeaders(),
[HttpHeaders.contentType]: ContentType.applicationJson,
[CosmosSDKConstants.HttpHeaders.PartitionKey]: JSON.stringify(documentId.partitionKeyHeader()),
},
})
.then(async (response) => {
if (response.ok) {
return response.json();
}
return await errorHandling(response, "updating document", params);
});
}
export function deleteDocument(databaseId: string, collection: Collection, documentId: DocumentId): Promise<void> {
if (!useMongoProxyEndpoint(MongoProxyApi.DeleteDocument)) {
return deleteDocument_ToBeDeprecated(databaseId, collection, documentId);
}
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const idComponents = documentId.self.split("/");
const path = idComponents.slice(0, 5).join("/");
const rid = encodeURIComponent(idComponents[5]);
const params = {
databaseID: databaseId,
collectionID: collection.id(),
resourceUrl: `${resourceEndpoint}${path}/${rid}`,
resourceID: rid,
resourceType: "docs",
subscriptionID: userContext.subscriptionId,
resourceGroup: userContext.resourceGroup,
databaseAccountName: databaseAccount.name,
partitionKey:
documentId && documentId.partitionKey && !documentId.partitionKey.systemKey
? documentId.partitionKeyProperties?.[0]
: "",
};
const endpoint = getFeatureEndpointOrDefault(MongoProxyApi.DeleteDocument);
return window
.fetch(endpoint, {
method: "DELETE",
body: JSON.stringify(params),
headers: {
...defaultHeaders,
...authHeaders(),
[HttpHeaders.contentType]: ContentType.applicationJson,
},
})
.then(async (response) => {
if (response.ok) {
return undefined;
}
return await errorHandling(response, "deleting document", params);
});
}
export function deleteDocument_ToBeDeprecated(
databaseId: string,
collection: Collection,
documentId: DocumentId,
): Promise<void> {
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const idComponents = documentId.self.split("/");
const path = idComponents.slice(0, 5).join("/");
const rid = encodeURIComponent(idComponents[5]);
const params = {
db: databaseId,
coll: collection.id(),
resourceUrl: `${resourceEndpoint}${path}/${rid}`,
rid,
rtype: "docs",
sid: userContext.subscriptionId,
rg: userContext.resourceGroup,
dba: databaseAccount.name,
pk:
documentId && documentId.partitionKey && !documentId.partitionKey.systemKey
? documentId.partitionKeyProperties?.[0]
: "",
};
const endpoint = getFeatureEndpointOrDefault("deleteDocument");
return window
.fetch(`${endpoint}?${queryString.stringify(params)}`, {
method: "DELETE",
headers: {
...defaultHeaders,
...authHeaders(),
[HttpHeaders.contentType]: ContentType.applicationJson,
[CosmosSDKConstants.HttpHeaders.PartitionKey]: JSON.stringify(documentId.partitionKeyHeader()),
},
})
.then(async (response) => {
if (response.ok) {
return undefined;
}
return await errorHandling(response, "deleting document", params);
});
}
export function deleteDocuments(
databaseId: string,
collection: Collection,
@@ -575,7 +275,7 @@ export function deleteDocuments(
resourceGroup: userContext.resourceGroup,
databaseAccountName: databaseAccount.name,
};
const endpoint = getFeatureEndpointOrDefault(MongoProxyApi.BulkDelete);
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT);
return window
.fetch(`${endpoint}/bulkdelete`, {
@@ -599,9 +299,6 @@ export function deleteDocuments(
export function createMongoCollectionWithProxy(
params: DataModels.CreateCollectionParams,
): Promise<DataModels.Collection> {
if (!useMongoProxyEndpoint(MongoProxyApi.CreateCollectionWithProxy)) {
return createMongoCollectionWithProxy_ToBeDeprecated(params);
}
const { databaseAccount } = userContext;
const shardKey: string = params.partitionKey?.paths[0];
@@ -622,7 +319,7 @@ export function createMongoCollectionWithProxy(
isSharded: !!shardKey,
};
const endpoint = getFeatureEndpointOrDefault(MongoProxyApi.CreateCollectionWithProxy);
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT);
return window
.fetch(`${endpoint}/createCollection`, {
@@ -642,70 +339,6 @@ export function createMongoCollectionWithProxy(
});
}
export function createMongoCollectionWithProxy_ToBeDeprecated(
params: DataModels.CreateCollectionParams,
): Promise<DataModels.Collection> {
const { databaseAccount } = userContext;
const shardKey: string = params.partitionKey?.paths[0];
const mongoParams: DataModels.MongoParameters = {
resourceUrl: databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint,
db: params.databaseId,
coll: params.collectionId,
pk: shardKey,
offerThroughput: params.autoPilotMaxThroughput || params.offerThroughput,
cd: params.createNewDatabase,
st: params.databaseLevelThroughput,
is: !!shardKey,
rid: "",
rtype: "colls",
sid: userContext.subscriptionId,
rg: userContext.resourceGroup,
dba: databaseAccount.name,
isAutoPilot: !!params.autoPilotMaxThroughput,
};
const endpoint = getFeatureEndpointOrDefault("createCollectionWithProxy");
return window
.fetch(
`${endpoint}/createCollection?${queryString.stringify(
mongoParams as unknown as queryString.ParsedUrlQueryInput,
)}`,
{
method: "POST",
headers: {
...defaultHeaders,
...authHeaders(),
[HttpHeaders.contentType]: "application/json",
},
},
)
.then(async (response) => {
if (response.ok) {
return response.json();
}
return await errorHandling(response, "creating collection", mongoParams);
});
}
export function getFeatureEndpointOrDefault(feature: string): string {
let endpoint;
if (useMongoProxyEndpoint(feature)) {
endpoint = configContext.MONGO_PROXY_ENDPOINT;
} else {
const allowedMongoProxyEndpoints = configContext.allowedMongoProxyEndpoints || [
...defaultAllowedMongoProxyEndpoints,
...allowedMongoProxyEndpoints_ToBeDeprecated,
];
endpoint =
hasFlag(userContext.features.mongoProxyAPIs, feature) &&
validateEndpoint(userContext.features.mongoProxyEndpoint, allowedMongoProxyEndpoints)
? userContext.features.mongoProxyEndpoint
: configContext.MONGO_BACKEND_ENDPOINT || configContext.BACKEND_ENDPOINT;
}
return getEndpoint(endpoint);
}
export function getEndpoint(endpoint: string): string {
let url = endpoint + "/api/mongo/explorer";
@@ -719,84 +352,6 @@ export function getEndpoint(endpoint: string): string {
return url;
}
export function useMongoProxyEndpoint(mongoProxyApi: string): boolean {
const mongoProxyEnvironmentMap: { [key: string]: string[] } = {
[MongoProxyApi.ResourceList]: [
MongoProxyEndpoints.Development,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
],
[MongoProxyApi.QueryDocuments]: [
MongoProxyEndpoints.Development,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
],
[MongoProxyApi.CreateDocument]: [
MongoProxyEndpoints.Development,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
],
[MongoProxyApi.ReadDocument]: [
MongoProxyEndpoints.Development,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
],
[MongoProxyApi.UpdateDocument]: [
MongoProxyEndpoints.Development,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
],
[MongoProxyApi.DeleteDocument]: [
MongoProxyEndpoints.Development,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
],
[MongoProxyApi.CreateCollectionWithProxy]: [
MongoProxyEndpoints.Development,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
],
[MongoProxyApi.LegacyMongoShell]: [
MongoProxyEndpoints.Development,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
],
[MongoProxyApi.BulkDelete]: [
MongoProxyEndpoints.Development,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
],
};
if (!mongoProxyEnvironmentMap[mongoProxyApi] || !configContext.MONGO_PROXY_ENDPOINT) {
return false;
}
if (configContext.globallyEnabledMongoAPIs.includes(mongoProxyApi)) {
return true;
}
return mongoProxyEnvironmentMap[mongoProxyApi].includes(configContext.MONGO_PROXY_ENDPOINT);
}
export class ThrottlingError extends Error {
constructor(message: string) {
super(message);
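With the feature-flag indirection removed, every caller resolves the proxy URL the same way through `getEndpoint`. A small sketch of the resulting URLs, based on the hunk above and the updated test expectations (import paths assumed):

```ts
import { configContext } from "ConfigContext";
import { getEndpoint } from "Common/MongoProxyClient";

// AAD auth resolves to `${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer`;
// connection-string auth (per the tests above) to `${configContext.MONGO_PROXY_ENDPOINT}/api/connectionstring/mongo/explorer`.
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT);
```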

View File

@@ -1,4 +1,4 @@
// import { QueryOperationOptions } from "@azure/cosmos";
import { QueryOperationOptions } from "@azure/cosmos";
import { QueryResults } from "../../Contracts/ViewModels";
import { logConsoleInfo, logConsoleProgress } from "../../Utils/NotificationConsoleUtils";
import { getEntityName } from "../DocumentUtility";
@@ -9,13 +9,13 @@ export const queryDocumentsPage = async (
resourceName: string,
documentsIterator: MinimalQueryIterator,
firstItemIndex: number,
// queryOperationOptions?: QueryOperationOptions,
queryOperationOptions?: QueryOperationOptions,
): Promise<QueryResults> => {
const entityName = getEntityName();
const clearMessage = logConsoleProgress(`Querying ${entityName} for container ${resourceName}`);
try {
const result: QueryResults = await nextPage(documentsIterator, firstItemIndex);
const result: QueryResults = await nextPage(documentsIterator, firstItemIndex, queryOperationOptions);
const itemCount = (result.documents && result.documents.length) || 0;
logConsoleInfo(`Successfully fetched ${itemCount} ${entityName} for container ${resourceName}`);
return result;

View File

@@ -105,6 +105,8 @@ const readCollectionOfferWithARM = async (databaseId: string, collectionId: stri
? parseInt(resource.softAllowedMaximumThroughput)
: resource.softAllowedMaximumThroughput;
const throughputBuckets = resource?.throughputBuckets;
if (autoscaleSettings) {
return {
id: offerId,
@@ -114,6 +116,7 @@ const readCollectionOfferWithARM = async (databaseId: string, collectionId: stri
offerReplacePending: resource.offerReplacePending === "true",
instantMaximumThroughput,
softAllowedMaximumThroughput,
throughputBuckets,
};
}
@@ -125,6 +128,7 @@ const readCollectionOfferWithARM = async (databaseId: string, collectionId: stri
offerReplacePending: resource.offerReplacePending === "true",
instantMaximumThroughput,
softAllowedMaximumThroughput,
throughputBuckets,
};
}

View File

@@ -1,6 +1,6 @@
import { OfferDefinition, RequestOptions } from "@azure/cosmos";
import { AuthType } from "../../AuthType";
import { Offer, SDKOfferDefinition, UpdateOfferParams } from "../../Contracts/DataModels";
import { Offer, SDKOfferDefinition, ThroughputBucket, UpdateOfferParams } from "../../Contracts/DataModels";
import { userContext } from "../../UserContext";
import {
migrateCassandraKeyspaceToAutoscale,
@@ -359,6 +359,13 @@ const createUpdateOfferBody = (params: UpdateOfferParams): ThroughputSettingsUpd
body.properties.resource.throughput = params.manualThroughput;
}
if (params.throughputBuckets) {
const throughputBuckets = params.throughputBuckets.filter(
(bucket: ThroughputBucket) => bucket.maxThroughputPercentage !== 100,
);
body.properties.resource.throughputBuckets = throughputBuckets;
}
return body;
};
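The new block only persists buckets whose cap actually differs from the default: entries left at 100% are filtered out before the ARM update body is built. A standalone sketch of the same rule, using the `ThroughputBucket` contract added in this change:

```ts
import { ThroughputBucket } from "Contracts/DataModels";

// Buckets at the implicit 100% default are dropped from the payload.
const toPersistedBuckets = (buckets: ThroughputBucket[]): ThroughputBucket[] =>
  buckets.filter((bucket) => bucket.maxThroughputPercentage !== 100);

// Example: keeps { id: 1, maxThroughputPercentage: 50 }, drops { id: 2, maxThroughputPercentage: 100 }.
```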

View File

@@ -12,7 +12,6 @@ import {
allowedGraphEndpoints,
allowedHostedExplorerEndpoints,
allowedJunoOrigins,
allowedMongoBackendEndpoints,
allowedMsalRedirectEndpoints,
defaultAllowedArmEndpoints,
defaultAllowedBackendEndpoints,
@@ -50,10 +49,8 @@ export interface ConfigContext {
CATALOG_API_KEY: string;
ARCADIA_ENDPOINT: string;
ARCADIA_LIVY_ENDPOINT_DNS_ZONE: string;
BACKEND_ENDPOINT?: string;
PORTAL_BACKEND_ENDPOINT: string;
NEW_BACKEND_APIS?: BackendApi[];
MONGO_BACKEND_ENDPOINT?: string;
MONGO_PROXY_ENDPOINT: string;
CASSANDRA_PROXY_ENDPOINT: string;
NEW_CASSANDRA_APIS?: string[];
@@ -91,7 +88,7 @@ let configContext: Readonly<ConfigContext> = {
`^https:\\/\\/.*\\.analysis-df\\.net$`,
`^https:\\/\\/.*\\.analysis-df\\.windows\\.net$`,
`^https:\\/\\/.*\\.azure-test\\.net$`,
`^https:\\/\\/cosmos-explorer-preview\\.azurewebsites\\.net$`,
`^https:\\/\\/dataexplorer-preview\\.azurewebsites\\.net$`,
], // Webpack injects this at build time
gitSha: process.env.GIT_SHA,
hostedExplorerURL: "https://cosmos.azure.com/",
@@ -109,7 +106,6 @@ let configContext: Readonly<ConfigContext> = {
GITHUB_CLIENT_ID: "6cb2f63cf6f7b5cbdeca", // Registered OAuth app: https://github.com/organizations/AzureCosmosDBNotebooks/settings/applications/1189306
GITHUB_TEST_ENV_CLIENT_ID: "b63fc8cbf87fd3c6e2eb", // Registered OAuth app: https://github.com/organizations/AzureCosmosDBNotebooks/settings/applications/1777772
JUNO_ENDPOINT: JunoEndpoints.Prod,
BACKEND_ENDPOINT: "https://main.documentdb.ext.azure.com",
PORTAL_BACKEND_ENDPOINT: PortalBackendEndpoints.Prod,
MONGO_PROXY_ENDPOINT: MongoProxyEndpoints.Prod,
CASSANDRA_PROXY_ENDPOINT: CassandraProxyEndpoints.Prod,
@@ -152,15 +148,6 @@ export function updateConfigContext(newContext: Partial<ConfigContext>): void {
delete newContext.ARCADIA_ENDPOINT;
}
if (
!validateEndpoint(
newContext.BACKEND_ENDPOINT,
configContext.allowedBackendEndpoints || defaultAllowedBackendEndpoints,
)
) {
delete newContext.BACKEND_ENDPOINT;
}
if (
!validateEndpoint(
newContext.MONGO_PROXY_ENDPOINT,
@@ -170,10 +157,6 @@ export function updateConfigContext(newContext: Partial<ConfigContext>): void {
delete newContext.MONGO_PROXY_ENDPOINT;
}
if (!validateEndpoint(newContext.MONGO_BACKEND_ENDPOINT, allowedMongoBackendEndpoints)) {
delete newContext.MONGO_BACKEND_ENDPOINT;
}
if (
!validateEndpoint(
newContext.CASSANDRA_PROXY_ENDPOINT,

View File

@@ -6,6 +6,7 @@ export interface ArmEntity {
location: string;
type: string;
kind: string;
tags?: Tags;
}
export interface DatabaseAccount extends ArmEntity {
@@ -41,6 +42,11 @@ export interface DatabaseAccountExtendedProperties {
publicNetworkAccess?: string;
enablePriorityBasedExecution?: boolean;
vcoreMongoEndpoint?: string;
virtualNetworkRules?: VNetRule[];
}
export interface VNetRule {
id: string;
}
export interface DatabaseAccountResponseLocation {
@@ -274,6 +280,12 @@ export interface Offer {
offerReplacePending: boolean;
instantMaximumThroughput?: number;
softAllowedMaximumThroughput?: number;
throughputBuckets?: ThroughputBucket[];
}
export interface ThroughputBucket {
id: number;
maxThroughputPercentage: number;
}
export interface SDKOfferDefinition extends Resource {
@@ -396,6 +408,7 @@ export interface UpdateOfferParams {
collectionId?: string;
migrateToAutoPilot?: boolean;
migrateToManual?: boolean;
throughputBuckets?: ThroughputBucket[];
}
export interface Notification {
@@ -663,3 +676,5 @@ export interface FeatureRegistration {
state: string;
};
}
export type Tags = { [key: string]: string };

View File

@@ -41,7 +41,7 @@ export enum MessageTypes {
OpenPostgreSQLPasswordReset,
OpenPostgresNetworkingBlade,
OpenCosmosDBNetworkingBlade,
DisplayNPSSurvey,
DisplayNPSSurvey, // unused
OpenVCoreMongoNetworkingBlade,
OpenVCoreMongoConnectionStringsBlade,
GetAuthorizationToken, // unused. Can be removed if the portal uses the same list of enums.

View File

@@ -406,7 +406,6 @@ export interface DataExplorerInputsFrame {
csmEndpoint?: string;
dnsSuffix?: string;
serverId?: string;
extensionEndpoint?: string;
portalBackendEndpoint?: string;
mongoProxyEndpoint?: string;
cassandraProxyEndpoint?: string;

View File

@@ -96,17 +96,17 @@ export const createCollectionContextMenuButton = (
iconSrc: HostedTerminalIcon,
onClick: () => {
const selectedCollection: ViewModels.Collection = useSelectedNode.getState().findSelectedCollection();
if (useNotebook.getState().isShellEnabled) {
if (useNotebook.getState().isShellEnabled || userContext.features.enableCloudShell) {
container.openNotebookTerminal(ViewModels.TerminalKind.Mongo);
} else {
selectedCollection && selectedCollection.onNewMongoShellClick();
}
},
label: useNotebook.getState().isShellEnabled ? "Open Mongo Shell" : "New Shell",
label: (useNotebook.getState().isShellEnabled || userContext.features.enableCloudShell) ? "Open Mongo Shell" : "New Shell",
});
}
if (useNotebook.getState().isShellEnabled && userContext.apiType === "Cassandra") {
if ((useNotebook.getState().isShellEnabled || userContext.features.enableCloudShell) && userContext.apiType === "Cassandra") {
items.push({
iconSrc: HostedTerminalIcon,
onClick: () => {

View File

@@ -1,5 +1,7 @@
import { AuthType } from "AuthType";
import { shallow } from "enzyme";
import ko from "knockout";
import { Features } from "Platform/Hosted/extractFeatures";
import React from "react";
import { updateCollection } from "../../../Common/dataAccess/updateCollection";
import { updateOffer } from "../../../Common/dataAccess/updateOffer";
@@ -247,4 +249,42 @@ describe("SettingsComponent", () => {
expect(conflictResolutionPolicy.mode).toEqual(DataModels.ConflictResolutionMode.Custom);
expect(conflictResolutionPolicy.conflictResolutionProcedure).toEqual(expectSprocPath);
});
it("should save throughput bucket changes when Save button is clicked", async () => {
updateUserContext({
apiType: "SQL",
features: { enableThroughputBuckets: true } as Features,
authType: AuthType.AAD,
});
const wrapper = shallow(<SettingsComponent {...baseProps} />);
const settingsComponentInstance = wrapper.instance() as SettingsComponent;
const isEnabled = settingsComponentInstance["throughputBucketsEnabled"];
expect(isEnabled).toBe(true);
wrapper.setState({
isThroughputBucketsSaveable: true,
throughputBuckets: [
{ id: 1, maxThroughputPercentage: 70 },
{ id: 2, maxThroughputPercentage: 60 },
],
});
await settingsComponentInstance.onSaveClick();
expect(updateOffer).toHaveBeenCalledWith({
databaseId: collection.databaseId,
collectionId: collection.id(),
currentOffer: expect.any(Object),
autopilotThroughput: collection.offer().autoscaleMaxThroughput,
manualThroughput: collection.offer().manualThroughput,
throughputBuckets: [
{ id: 1, maxThroughputPercentage: 70 },
{ id: 2, maxThroughputPercentage: 60 },
],
});
expect(wrapper.state("isThroughputBucketsSaveable")).toBe(false);
});
});

View File

@@ -7,6 +7,10 @@ import {
ContainerPolicyComponent,
ContainerPolicyComponentProps,
} from "Explorer/Controls/Settings/SettingsSubComponents/ContainerPolicyComponent";
import {
ThroughputBucketsComponent,
ThroughputBucketsComponentProps,
} from "Explorer/Controls/Settings/SettingsSubComponents/ThroughputInputComponents/ThroughputBucketsComponent";
import { useDatabases } from "Explorer/useDatabases";
import { isFullTextSearchEnabled, isVectorSearchEnabled } from "Utils/CapabilityUtils";
import { isRunningOnPublicCloud } from "Utils/CloudUtils";
@@ -86,6 +90,8 @@ export interface SettingsComponentState {
wasAutopilotOriginallySet: boolean;
isScaleSaveable: boolean;
isScaleDiscardable: boolean;
throughputBuckets: DataModels.ThroughputBucket[];
throughputBucketsBaseline: DataModels.ThroughputBucket[];
throughputError: string;
timeToLive: TtlType;
@@ -104,6 +110,7 @@ export interface SettingsComponentState {
changeFeedPolicyBaseline: ChangeFeedPolicyState;
isSubSettingsSaveable: boolean;
isSubSettingsDiscardable: boolean;
isThroughputBucketsSaveable: boolean;
vectorEmbeddingPolicy: DataModels.VectorEmbeddingPolicy;
vectorEmbeddingPolicyBaseline: DataModels.VectorEmbeddingPolicy;
@@ -158,6 +165,7 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
private isVectorSearchEnabled: boolean;
private isFullTextSearchEnabled: boolean;
private totalThroughputUsed: number;
private throughputBucketsEnabled: boolean;
public mongoDBCollectionResource: MongoDBCollectionResource;
constructor(props: SettingsComponentProps) {
@@ -175,6 +183,10 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
this.isFullTextSearchEnabled = isFullTextSearchEnabled() && !hasDatabaseSharedThroughput(this.collection);
this.changeFeedPolicyVisible = userContext.features.enableChangeFeedPolicy;
this.throughputBucketsEnabled =
userContext.apiType === "SQL" &&
userContext.features.enableThroughputBuckets &&
userContext.authType === AuthType.AAD;
// Mongo container with system partition key still treat as "Fixed"
this.isFixedContainer =
@@ -193,6 +205,8 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
wasAutopilotOriginallySet: false,
isScaleSaveable: false,
isScaleDiscardable: false,
throughputBuckets: undefined,
throughputBucketsBaseline: undefined,
throughputError: undefined,
timeToLive: undefined,
@@ -211,6 +225,7 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
changeFeedPolicyBaseline: undefined,
isSubSettingsSaveable: false,
isSubSettingsDiscardable: false,
isThroughputBucketsSaveable: false,
vectorEmbeddingPolicy: undefined,
vectorEmbeddingPolicyBaseline: undefined,
@@ -327,7 +342,8 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
this.state.isIndexingPolicyDirty ||
this.state.isConflictResolutionDirty ||
this.state.isComputedPropertiesDirty ||
(!!this.state.currentMongoIndexes && this.state.isMongoIndexingPolicySaveable)
(!!this.state.currentMongoIndexes && this.state.isMongoIndexingPolicySaveable) ||
this.state.isThroughputBucketsSaveable
);
};
@@ -339,7 +355,8 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
this.state.isIndexingPolicyDirty ||
this.state.isConflictResolutionDirty ||
this.state.isComputedPropertiesDirty ||
(!!this.state.currentMongoIndexes && this.state.isMongoIndexingPolicyDiscardable)
(!!this.state.currentMongoIndexes && this.state.isMongoIndexingPolicyDiscardable) ||
this.state.isThroughputBucketsSaveable
);
};
@@ -419,6 +436,8 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
this.setState({
throughput: this.state.throughputBaseline,
throughputBuckets: this.state.throughputBucketsBaseline,
throughputBucketsBaseline: this.state.throughputBucketsBaseline,
timeToLive: this.state.timeToLiveBaseline,
timeToLiveSeconds: this.state.timeToLiveSecondsBaseline,
displayedTtlSeconds: this.state.displayedTtlSecondsBaseline,
@@ -441,6 +460,7 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
isScaleSaveable: false,
isScaleDiscardable: false,
isSubSettingsSaveable: false,
isThroughputBucketsSaveable: false,
isSubSettingsDiscardable: false,
isContainerPolicyDirty: false,
isIndexingPolicyDirty: false,
@@ -479,6 +499,10 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
private onIndexingPolicyContentChange = (newIndexingPolicy: DataModels.IndexingPolicy): void =>
this.setState({ indexingPolicyContent: newIndexingPolicy });
private onThroughputBucketsSaveableChange = (isSaveable: boolean): void => {
this.setState({ isThroughputBucketsSaveable: isSaveable });
};
private resetShouldDiscardContainerPolicies = (): void => this.setState({ shouldDiscardContainerPolicies: false });
private resetShouldDiscardIndexingPolicy = (): void => this.setState({ shouldDiscardIndexingPolicy: false });
@@ -749,9 +773,13 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
] as DataModels.ComputedProperties;
}
const throughputBuckets = this.offer?.throughputBuckets;
return {
throughput: offerThroughput,
throughputBaseline: offerThroughput,
throughputBuckets,
throughputBucketsBaseline: throughputBuckets,
changeFeedPolicy: changeFeedPolicy,
changeFeedPolicyBaseline: changeFeedPolicy,
timeToLive: timeToLive,
@@ -839,6 +867,10 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
this.setState({ throughput: newThroughput, throughputError });
};
private onThroughputBucketChange = (throughputBuckets: DataModels.ThroughputBucket[]): void => {
this.setState({ throughputBuckets });
};
private onAutoPilotSelected = (isAutoPilotSelected: boolean): void =>
this.setState({ isAutoPilotSelected: isAutoPilotSelected });
@@ -1029,6 +1061,24 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
}
}
if (this.throughputBucketsEnabled && this.state.isThroughputBucketsSaveable) {
const updatedOffer: DataModels.Offer = await updateOffer({
databaseId: this.collection.databaseId,
collectionId: this.collection.id(),
currentOffer: this.collection.offer(),
autopilotThroughput: this.collection.offer().autoscaleMaxThroughput
? this.collection.offer().autoscaleMaxThroughput
: undefined,
manualThroughput: this.collection.offer().manualThroughput
? this.collection.offer().manualThroughput
: undefined,
throughputBuckets: this.state.throughputBuckets,
});
this.collection.offer(updatedOffer);
this.offer = updatedOffer;
this.setState({ isThroughputBucketsSaveable: false });
}
if (this.state.isScaleSaveable) {
const updateOfferParams: DataModels.UpdateOfferParams = {
databaseId: this.collection.databaseId,
@@ -1209,6 +1259,13 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
onConflictResolutionDirtyChange: this.onConflictResolutionDirtyChange,
};
const throughputBucketsComponentProps: ThroughputBucketsComponentProps = {
currentBuckets: this.state.throughputBuckets,
throughputBucketsBaseline: this.state.throughputBucketsBaseline,
onBucketsChange: this.onThroughputBucketChange,
onSaveableChange: this.onThroughputBucketsSaveableChange,
};
const partitionKeyComponentProps: PartitionKeyComponentProps = {
database: useDatabases.getState().findDatabaseWithId(this.collection.databaseId),
collection: this.collection,
@@ -1271,6 +1328,13 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
});
}
if (this.throughputBucketsEnabled) {
tabs.push({
tab: SettingsV2TabTypes.ThroughputBucketsTab,
content: <ThroughputBucketsComponent {...throughputBucketsComponentProps} />,
});
}
const pivotProps: IPivotProps = {
onLinkClick: this.onPivotChange,
selectedKey: SettingsV2TabTypes[this.state.selectedTab],
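Taken together, the new tab and save path are gated on a single condition computed in the constructor: SQL API, the `enableThroughputBuckets` feature flag, and AAD auth. A condensed sketch of that gate (it mirrors the constructor hunk above rather than adding new behavior):

```ts
import { AuthType } from "AuthType";
import { userContext } from "UserContext";

// The ThroughputBucketsTab is only added when all three conditions hold.
const throughputBucketsEnabled: boolean =
  userContext.apiType === "SQL" &&
  userContext.features.enableThroughputBuckets &&
  userContext.authType === AuthType.AAD;
```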

View File

@@ -14,6 +14,7 @@ import * as ViewModels from "../../../../Contracts/ViewModels";
import { handleError } from "Common/ErrorHandlingUtils";
import { cancelDataTransferJob, pollDataTransferJob } from "Common/dataAccess/dataTransfers";
import { Platform, configContext } from "ConfigContext";
import Explorer from "Explorer/Explorer";
import { ChangePartitionKeyPane } from "Explorer/Panes/ChangePartitionKeyPane/ChangePartitionKeyPane";
import {
@@ -177,12 +178,14 @@ export const PartitionKeyComponent: React.FC<PartitionKeyComponentProps> = ({ da
To change the partition key, a new destination container must be created or an existing destination container
selected. Data will then be copied to the destination container.
</Text>
<PrimaryButton
styles={{ root: { width: "fit-content" } }}
text="Change"
onClick={startPartitionkeyChangeWorkflow}
disabled={isCurrentJobInProgress(portalDataTransferJob)}
/>
{configContext.platform !== Platform.Emulator && (
<PrimaryButton
styles={{ root: { width: "fit-content" } }}
text="Change"
onClick={startPartitionkeyChangeWorkflow}
disabled={isCurrentJobInProgress(portalDataTransferJob)}
/>
)}
{portalDataTransferJob && (
<Stack>
<Text styles={textHeadingStyle}>{partitionKeyName} change job</Text>

View File

@@ -0,0 +1,177 @@
import "@testing-library/jest-dom";
import { fireEvent, render, screen } from "@testing-library/react";
import React from "react";
import { ThroughputBucketsComponent } from "./ThroughputBucketsComponent";
describe("ThroughputBucketsComponent", () => {
const mockOnBucketsChange = jest.fn();
const mockOnSaveableChange = jest.fn();
const defaultProps = {
currentBuckets: [
{ id: 1, maxThroughputPercentage: 50 },
{ id: 2, maxThroughputPercentage: 60 },
],
throughputBucketsBaseline: [
{ id: 1, maxThroughputPercentage: 40 },
{ id: 2, maxThroughputPercentage: 50 },
],
onBucketsChange: mockOnBucketsChange,
onSaveableChange: mockOnSaveableChange,
};
beforeEach(() => {
jest.clearAllMocks();
});
it("renders the correct number of buckets", () => {
render(<ThroughputBucketsComponent {...defaultProps} />);
expect(screen.getAllByText(/Group \d+/)).toHaveLength(5);
});
it("renders buckets in the correct order even if input is unordered", () => {
const unorderedBuckets = [
{ id: 2, maxThroughputPercentage: 60 },
{ id: 1, maxThroughputPercentage: 50 },
];
render(<ThroughputBucketsComponent {...defaultProps} currentBuckets={unorderedBuckets} />);
const bucketLabels = screen.getAllByText(/Group \d+/).map((el) => el.textContent);
expect(bucketLabels).toEqual(["Group 1 (Data Explorer Query Bucket)", "Group 2", "Group 3", "Group 4", "Group 5"]);
});
it("renders all provided buckets even if they exceed the max default bucket count", () => {
const oversizedBuckets = [
{ id: 1, maxThroughputPercentage: 50 },
{ id: 2, maxThroughputPercentage: 60 },
{ id: 3, maxThroughputPercentage: 70 },
{ id: 4, maxThroughputPercentage: 80 },
{ id: 5, maxThroughputPercentage: 90 },
{ id: 6, maxThroughputPercentage: 100 },
{ id: 7, maxThroughputPercentage: 40 },
];
render(<ThroughputBucketsComponent {...defaultProps} currentBuckets={oversizedBuckets} />);
expect(screen.getAllByText(/Group \d+/)).toHaveLength(7);
expect(screen.getByDisplayValue("50")).toBeInTheDocument();
expect(screen.getByDisplayValue("60")).toBeInTheDocument();
expect(screen.getByDisplayValue("70")).toBeInTheDocument();
expect(screen.getByDisplayValue("80")).toBeInTheDocument();
expect(screen.getByDisplayValue("90")).toBeInTheDocument();
expect(screen.getByDisplayValue("100")).toBeInTheDocument();
expect(screen.getByDisplayValue("40")).toBeInTheDocument();
});
it("calls onBucketsChange when a bucket value changes", () => {
render(<ThroughputBucketsComponent {...defaultProps} />);
const input = screen.getByDisplayValue("50");
fireEvent.change(input, { target: { value: "70" } });
expect(mockOnBucketsChange).toHaveBeenCalledWith([
{ id: 1, maxThroughputPercentage: 70 },
{ id: 2, maxThroughputPercentage: 60 },
{ id: 3, maxThroughputPercentage: 100 },
{ id: 4, maxThroughputPercentage: 100 },
{ id: 5, maxThroughputPercentage: 100 },
]);
});
it("triggers onSaveableChange when values change", () => {
render(<ThroughputBucketsComponent {...defaultProps} />);
const input = screen.getByDisplayValue("50");
fireEvent.change(input, { target: { value: "80" } });
expect(mockOnSaveableChange).toHaveBeenCalledWith(true);
});
it("updates state consistently after multiple changes to different buckets", () => {
render(<ThroughputBucketsComponent {...defaultProps} />);
const input1 = screen.getByDisplayValue("50");
fireEvent.change(input1, { target: { value: "70" } });
const input2 = screen.getByDisplayValue("60");
fireEvent.change(input2, { target: { value: "80" } });
expect(mockOnBucketsChange).toHaveBeenCalledWith([
{ id: 1, maxThroughputPercentage: 70 },
{ id: 2, maxThroughputPercentage: 80 },
{ id: 3, maxThroughputPercentage: 100 },
{ id: 4, maxThroughputPercentage: 100 },
{ id: 5, maxThroughputPercentage: 100 },
]);
});
it("resets to baseline when currentBuckets are reset", () => {
const { rerender } = render(<ThroughputBucketsComponent {...defaultProps} />);
const input1 = screen.getByDisplayValue("50");
fireEvent.change(input1, { target: { value: "70" } });
rerender(<ThroughputBucketsComponent {...defaultProps} currentBuckets={defaultProps.throughputBucketsBaseline} />);
expect(screen.getByDisplayValue("40")).toBeInTheDocument();
expect(screen.getByDisplayValue("50")).toBeInTheDocument();
});
it("does not call onBucketsChange when value remains unchanged", () => {
render(<ThroughputBucketsComponent {...defaultProps} />);
const input = screen.getByDisplayValue("50");
fireEvent.change(input, { target: { value: "50" } });
expect(mockOnBucketsChange).not.toHaveBeenCalled();
});
it("disables input and slider when maxThroughputPercentage is 100", () => {
render(
<ThroughputBucketsComponent
{...defaultProps}
currentBuckets={[
{ id: 1, maxThroughputPercentage: 100 },
{ id: 2, maxThroughputPercentage: 50 },
]}
/>,
);
const disabledInputs = screen.getAllByDisplayValue("100");
expect(disabledInputs.length).toBeGreaterThan(0);
expect(disabledInputs[0]).toBeDisabled();
const sliders = screen.getAllByRole("slider");
expect(sliders.length).toBeGreaterThan(0);
expect(sliders[0]).toHaveAttribute("aria-disabled", "true");
expect(sliders[1]).toHaveAttribute("aria-disabled", "false");
});
it("toggles bucket value between 50 and 100 with switch", () => {
render(<ThroughputBucketsComponent {...defaultProps} />);
const toggles = screen.getAllByRole("switch");
fireEvent.click(toggles[0]);
expect(mockOnBucketsChange).toHaveBeenCalledWith([
{ id: 1, maxThroughputPercentage: 100 },
{ id: 2, maxThroughputPercentage: 60 },
{ id: 3, maxThroughputPercentage: 100 },
{ id: 4, maxThroughputPercentage: 100 },
{ id: 5, maxThroughputPercentage: 100 },
]);
fireEvent.click(toggles[0]);
expect(mockOnBucketsChange).toHaveBeenCalledWith([
{ id: 1, maxThroughputPercentage: 50 },
{ id: 2, maxThroughputPercentage: 60 },
{ id: 3, maxThroughputPercentage: 100 },
{ id: 4, maxThroughputPercentage: 100 },
{ id: 5, maxThroughputPercentage: 100 },
]);
});
it("ensures default buckets are used when no buckets are provided", () => {
render(<ThroughputBucketsComponent {...defaultProps} currentBuckets={[]} />);
expect(screen.getAllByText(/Group \d+/)).toHaveLength(5);
expect(screen.getAllByDisplayValue("100")).toHaveLength(5);
});
});
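Note: the suite above relies on defaultProps and mock callbacks declared earlier in the spec file (not shown in this hunk). A minimal sketch that is consistent with the assertions above might look like the following; the exact baseline values are an assumption, not the actual fixture.

// Illustrative sketch only; the real defaultProps live earlier in this spec file.
const mockOnBucketsChange = jest.fn();
const mockOnSaveableChange = jest.fn();
const defaultProps: ThroughputBucketsComponentProps = {
  currentBuckets: [
    { id: 1, maxThroughputPercentage: 50 },
    { id: 2, maxThroughputPercentage: 60 },
    { id: 3, maxThroughputPercentage: 100 },
    { id: 4, maxThroughputPercentage: 100 },
    { id: 5, maxThroughputPercentage: 100 },
  ],
  // Assumed baseline; the "resets to baseline" test only requires that it differs from
  // currentBuckets and contains the values 40 and 50.
  throughputBucketsBaseline: [
    { id: 1, maxThroughputPercentage: 40 },
    { id: 2, maxThroughputPercentage: 50 },
    { id: 3, maxThroughputPercentage: 100 },
    { id: 4, maxThroughputPercentage: 100 },
    { id: 5, maxThroughputPercentage: 100 },
  ],
  onBucketsChange: mockOnBucketsChange,
  onSaveableChange: mockOnSaveableChange,
};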

View File

@@ -0,0 +1,105 @@
import { Label, Slider, Stack, TextField, Toggle } from "@fluentui/react";
import { ThroughputBucket } from "Contracts/DataModels";
import React, { FC, useEffect, useState } from "react";
import { isDirty } from "../../SettingsUtils";
const MAX_BUCKET_SIZES = 5;
const DEFAULT_BUCKETS = Array.from({ length: MAX_BUCKET_SIZES }, (_, i) => ({
id: i + 1,
maxThroughputPercentage: 100,
}));
export interface ThroughputBucketsComponentProps {
currentBuckets: ThroughputBucket[];
throughputBucketsBaseline: ThroughputBucket[];
onBucketsChange: (updatedBuckets: ThroughputBucket[]) => void;
onSaveableChange: (isSaveable: boolean) => void;
}
export const ThroughputBucketsComponent: FC<ThroughputBucketsComponentProps> = ({
currentBuckets,
throughputBucketsBaseline,
onBucketsChange,
onSaveableChange,
}) => {
const getThroughputBuckets = (buckets: ThroughputBucket[]): ThroughputBucket[] => {
if (!buckets || buckets.length === 0) {
return DEFAULT_BUCKETS;
}
const maxBuckets = Math.max(DEFAULT_BUCKETS.length, buckets.length);
const adjustedDefaultBuckets = Array.from({ length: maxBuckets }, (_, i) => ({
id: i + 1,
maxThroughputPercentage: 100,
}));
return adjustedDefaultBuckets.map(
(defaultBucket) => buckets?.find((bucket) => bucket.id === defaultBucket.id) || defaultBucket,
);
};
const [throughputBuckets, setThroughputBuckets] = useState<ThroughputBucket[]>(getThroughputBuckets(currentBuckets));
useEffect(() => {
setThroughputBuckets(getThroughputBuckets(currentBuckets));
onSaveableChange(false);
}, [currentBuckets]);
useEffect(() => {
const isChanged = isDirty(throughputBuckets, getThroughputBuckets(throughputBucketsBaseline));
onSaveableChange(isChanged);
}, [throughputBuckets]);
const handleBucketChange = (id: number, newValue: number) => {
const updatedBuckets = throughputBuckets.map((bucket) =>
bucket.id === id ? { ...bucket, maxThroughputPercentage: newValue } : bucket,
);
setThroughputBuckets(updatedBuckets);
const settingsChanged = isDirty(updatedBuckets, throughputBuckets);
settingsChanged && onBucketsChange(updatedBuckets);
};
const onToggle = (id: number, checked: boolean) => {
handleBucketChange(id, checked ? 50 : 100);
};
return (
<Stack tokens={{ childrenGap: "m" }} styles={{ root: { width: "70%", maxWidth: 700 } }}>
<Label>Throughput Buckets</Label>
<Stack>
{throughputBuckets?.map((bucket) => (
<Stack key={bucket.id} horizontal tokens={{ childrenGap: 8 }} verticalAlign="center">
<Slider
min={1}
max={100}
step={1}
value={bucket.maxThroughputPercentage}
onChange={(newValue) => handleBucketChange(bucket.id, newValue)}
showValue={false}
label={`Group ${bucket.id}${bucket.id === 1 ? " (Data Explorer Query Bucket)" : ""}`}
styles={{ root: { flex: 2, maxWidth: 400 } }}
disabled={bucket.maxThroughputPercentage === 100}
/>
<TextField
value={bucket.maxThroughputPercentage.toString()}
onChange={(event, newValue) => handleBucketChange(bucket.id, parseInt(newValue || "0", 10))}
type="number"
suffix="%"
styles={{
fieldGroup: { width: 80 },
}}
disabled={bucket.maxThroughputPercentage === 100}
/>
<Toggle
onText="Active"
offText="Inactive"
checked={bucket.maxThroughputPercentage !== 100}
onChange={(event, checked) => onToggle(bucket.id, checked)}
styles={{ root: { marginBottom: 0 }, text: { fontSize: 12 } }}
></Toggle>
</Stack>
))}
</Stack>
</Stack>
);
};
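A minimal usage sketch of how a settings tab might host this component. The wrapper below is illustrative only; the import path and the Save wiring are assumptions, not code from this PR.

import React, { useState } from "react";
import { ThroughputBucket } from "Contracts/DataModels";
// Path assumed for illustration; adjust to wherever the component lives in the tree.
import { ThroughputBucketsComponent } from "./ThroughputBucketsComponent";

export const ThroughputBucketsTabSketch: React.FC<{ baseline: ThroughputBucket[] }> = ({ baseline }) => {
  const [pendingBuckets, setPendingBuckets] = useState<ThroughputBucket[]>(baseline);
  const [isSaveable, setIsSaveable] = useState(false);

  return (
    <>
      <ThroughputBucketsComponent
        currentBuckets={baseline}
        throughputBucketsBaseline={baseline}
        onBucketsChange={setPendingBuckets}
        onSaveableChange={setIsSaveable}
      />
      {/* A real settings tab would enable Save from isSaveable and persist pendingBuckets on submit. */}
      <button disabled={!isSaveable} onClick={() => console.log("save", pendingBuckets)}>
        Save
      </button>
    </>
  );
};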

View File

@@ -17,14 +17,13 @@ import {
 } from "@fluentui/react";
 import React from "react";
 import * as DataModels from "../../../../../Contracts/DataModels";
-import { SubscriptionType } from "../../../../../Contracts/SubscriptionType";
 import * as SharedConstants from "../../../../../Shared/Constants";
 import { Action, ActionModifiers } from "../../../../../Shared/Telemetry/TelemetryConstants";
 import * as TelemetryProcessor from "../../../../../Shared/Telemetry/TelemetryProcessor";
 import { userContext } from "../../../../../UserContext";
 import * as AutoPilotUtils from "../../../../../Utils/AutoPilotUtils";
 import { autoPilotThroughput1K } from "../../../../../Utils/AutoPilotUtils";
-import { calculateEstimateNumber, usageInGB } from "../../../../../Utils/PricingUtils";
+import { calculateEstimateNumber } from "../../../../../Utils/PricingUtils";
 import { Int32 } from "../../../../Panes/Tables/Validators/EntityPropertyValidationCommon";
 import {
   PriceBreakdown,
@@ -366,29 +365,6 @@ export class ThroughputInputAutoPilotV3Component extends React.Component<
   });
 };
private minRUperGBSurvey = (): JSX.Element => {
const href = `https://ncv.microsoft.com/vRBTO37jmO?ctx={"AzureSubscriptionId":"${userContext.subscriptionId}","CosmosDBAccountName":"${userContext.databaseAccount?.name}"}`;
const oneTBinKB = 1000000000;
const minRUperGB = 10;
const featureFlagEnabled = userContext.features.showMinRUSurvey;
const collectionIsEligible =
userContext.subscriptionType !== SubscriptionType.Internal &&
this.props.usageSizeInKB > oneTBinKB &&
this.props.minimum >= usageInGB(this.props.usageSizeInKB) * minRUperGB;
if (featureFlagEnabled || collectionIsEligible) {
return (
<Text>
Need to scale below {this.props.minimum} RU/s? Reach out by filling{" "}
<a target="_blank" rel="noreferrer" href={href}>
this questionnaire
</a>
.
</Text>
);
}
return undefined;
};
 private renderThroughputModeChoices = (): JSX.Element => {
   const labelId = "settingsV2RadioButtonLabelId";
   return (
@@ -661,7 +637,6 @@ export class ThroughputInputAutoPilotV3Component extends React.Component<
         </Link>
       </Text>
     )}
-    {this.minRUperGBSurvey()}
     {this.props.spendAckVisible && (
       <Checkbox
         id="spendAckCheckBox"

View File

@@ -11,7 +11,8 @@ export type isDirtyTypes =
   | DataModels.IndexingPolicy
   | DataModels.ComputedProperties
   | DataModels.VectorEmbedding[]
-  | DataModels.FullTextPolicy;
+  | DataModels.FullTextPolicy
+  | DataModels.ThroughputBucket[];
 export const TtlOff = "off";
 export const TtlOn = "on";
 export const TtlOnNoDefault = "on-nodefault";
@@ -55,6 +56,7 @@ export enum SettingsV2TabTypes {
   PartitionKeyTab,
   ComputedPropertiesTab,
   ContainerVectorPolicyTab,
+  ThroughputBucketsTab,
 }
 export enum ContainerPolicyTabTypes {
@@ -167,6 +169,8 @@ export const getTabTitle = (tab: SettingsV2TabTypes): string => {
return "Computed Properties"; return "Computed Properties";
case SettingsV2TabTypes.ContainerVectorPolicyTab: case SettingsV2TabTypes.ContainerVectorPolicyTab:
return "Container Policies"; return "Container Policies";
case SettingsV2TabTypes.ThroughputBucketsTab:
return "Throughput Buckets";
default: default:
throw new Error(`Unknown tab ${tab}`); throw new Error(`Unknown tab ${tab}`);
} }
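With DataModels.ThroughputBucket[] added to isDirtyTypes, the settings code can diff edited buckets against the baseline the same way it already does for other policies. A minimal sketch, assuming isDirty deep-compares its two arguments and that the module path shown below is correct:

import { ThroughputBucket } from "Contracts/DataModels";
import { isDirty } from "Explorer/Controls/Settings/SettingsUtils"; // path assumed

const baseline: ThroughputBucket[] = [{ id: 1, maxThroughputPercentage: 100 }];
const edited: ThroughputBucket[] = [{ id: 1, maxThroughputPercentage: 50 }];

// true: the edited buckets differ from the baseline, so the tab reports itself as saveable.
const saveable = isDirty(edited, baseline);
console.log(saveable);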

View File

@@ -1,4 +1,5 @@
 import { Checkbox, DirectionalHint, Link, Stack, Text, TextField, TooltipHost } from "@fluentui/react";
+import { getWorkloadType } from "Common/DatabaseAccountUtility";
 import { useDatabases } from "Explorer/useDatabases";
 import React, { FunctionComponent, useEffect, useState } from "react";
 import * as Constants from "../../../Common/Constants";
@@ -34,10 +35,15 @@ export const ThroughputInput: FunctionComponent<ThroughputInputProps> = ({
   setIsThroughputCapExceeded,
   onCostAcknowledgeChange,
 }: ThroughputInputProps) => {
+  const defaultThroughput: number =
+    isFreeTier ||
+    isQuickstart ||
+    [Constants.WorkloadType.Learning, Constants.WorkloadType.DevelopmentTesting].includes(getWorkloadType())
+      ? AutoPilotUtils.autoPilotThroughput1K
+      : AutoPilotUtils.autoPilotThroughput4K;
   const [isAutoscaleSelected, setIsAutoScaleSelected] = useState<boolean>(true);
-  const [throughput, setThroughput] = useState<number>(
-    isFreeTier || isQuickstart ? AutoPilotUtils.autoPilotThroughput1K : AutoPilotUtils.autoPilotThroughput4K,
-  );
+  const [throughput, setThroughput] = useState<number>(defaultThroughput);
   const [isCostAcknowledged, setIsCostAcknowledged] = useState<boolean>(false);
   const [throughputError, setThroughputError] = useState<string>("");
   const [totalThroughputUsed, setTotalThroughputUsed] = useState<number>(0);
@@ -47,7 +53,6 @@ export const ThroughputInput: FunctionComponent<ThroughputInputProps> = ({
   const throughputCap = userContext.databaseAccount?.properties.capacity?.totalThroughputLimit;
   const numberOfRegions = userContext.databaseAccount?.properties.locations?.length || 1;
   useEffect(() => {
     // throughput cap check for the initial state
     let totalThroughput = 0;
@@ -157,9 +162,6 @@ export const ThroughputInput: FunctionComponent<ThroughputInputProps> = ({
   const handleOnChangeMode = (event: React.ChangeEvent<HTMLInputElement>, mode: string): void => {
     if (mode === "Autoscale") {
-      const defaultThroughput = isFreeTier
-        ? AutoPilotUtils.autoPilotThroughput1K
-        : AutoPilotUtils.autoPilotThroughput4K;
       setThroughput(defaultThroughput);
       setIsAutoScaleSelected(true);
       setThroughputValue(defaultThroughput);
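The default offer is now derived once from the account's workload type rather than only from free tier/quickstart. A reduced sketch of that decision, extracted for readability; the import paths are assumptions based on the imports in the diff above:

import { getWorkloadType } from "Common/DatabaseAccountUtility";
import * as Constants from "Common/Constants"; // imported as "../../../Common/Constants" in the file above
import * as AutoPilotUtils from "Utils/AutoPilotUtils"; // path assumed

// Sketch of the default autoscale max selection used for the initial throughput state.
export const pickDefaultAutoscaleMax = (isFreeTier: boolean, isQuickstart: boolean): number => {
  const lightweightWorkload = [Constants.WorkloadType.Learning, Constants.WorkloadType.DevelopmentTesting].includes(
    getWorkloadType(),
  );
  return isFreeTier || isQuickstart || lightweightWorkload
    ? AutoPilotUtils.autoPilotThroughput1K // 1,000 RU/s autoscale max for lighter workloads
    : AutoPilotUtils.autoPilotThroughput4K; // 4,000 RU/s autoscale max otherwise
};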

View File

@@ -35,7 +35,7 @@ import { PhoenixClient } from "../Phoenix/PhoenixClient";
 import * as ExplorerSettings from "../Shared/ExplorerSettings";
 import { Action, ActionModifiers } from "../Shared/Telemetry/TelemetryConstants";
 import * as TelemetryProcessor from "../Shared/Telemetry/TelemetryProcessor";
-import { isAccountNewerThanThresholdInMs, updateUserContext, userContext } from "../UserContext";
+import { updateUserContext, userContext } from "../UserContext";
 import { getCollectionName, getUploadName } from "../Utils/APITypeUtils";
 import { stringToBlob } from "../Utils/BlobUtils";
 import { isCapabilityEnabled } from "../Utils/CapabilityUtils";
@@ -278,37 +278,6 @@ export default class Explorer {
     }
   }
public openNPSSurveyDialog(): void {
if (!Platform.Portal || !["Postgres", "SQL", "Mongo"].includes(userContext.apiType)) {
return;
}
const ONE_DAY_IN_MS = 86400000;
const SEVEN_DAYS_IN_MS = 604800000;
// Try Cosmos DB subscription - survey shown to 100% of users at day 1 in Data Explorer.
if (userContext.isTryCosmosDBSubscription) {
if (isAccountNewerThanThresholdInMs(userContext.databaseAccount?.systemData?.createdAt || "", ONE_DAY_IN_MS)) {
Logger.logInfo(
`Sending message to Portal to check if NPS Survey can be displayed in Try Cosmos DB ${userContext.apiType}`,
"Explorer/openNPSSurveyDialog",
);
sendMessage({ type: MessageTypes.DisplayNPSSurvey });
}
} else {
// Show survey when an existing account is older than 7 days
if (
!isAccountNewerThanThresholdInMs(userContext.databaseAccount?.systemData?.createdAt || "", SEVEN_DAYS_IN_MS)
) {
Logger.logInfo(
`Sending message to Portal to check if NPS Survey can be displayed for existing ${userContext.apiType} account older than 7 days`,
"Explorer/openNPSSurveyDialog",
);
sendMessage({ type: MessageTypes.DisplayNPSSurvey });
}
}
}
 public async openCESCVAFeedbackBlade(): Promise<void> {
   sendMessage({ type: MessageTypes.OpenCESCVAFeedbackBlade });
   Logger.logInfo(
@@ -937,25 +906,28 @@
   }
 }
 public async openNotebookTerminal(kind: ViewModels.TerminalKind): Promise<void> {
-  if (useNotebook.getState().isPhoenixFeatures) {
-    await this.allocateContainer(PoolIdType.DefaultPoolId);
-    const notebookServerInfo = useNotebook.getState().notebookServerInfo;
-    if (notebookServerInfo && notebookServerInfo.notebookServerEndpoint !== undefined) {
-      this.connectToNotebookTerminal(kind);
-    } else {
-      useDialog
-        .getState()
-        .showOkModalDialog(
-          "Failed to connect",
-          "Failed to connect to temporary workspace. This could happen because of network issues. Please refresh the page and try again.",
-        );
-    }
-  } else {
-    this.connectToNotebookTerminal(kind);
-  }
+  if (userContext.features.enableCloudShell || !useNotebook.getState().isPhoenixFeatures) {
+    this.connectToTerminal(kind);
+    return;
+  }
+  await this.allocateContainer(PoolIdType.DefaultPoolId);
+  const notebookServerInfo = useNotebook.getState().notebookServerInfo;
+  if (notebookServerInfo?.notebookServerEndpoint) {
+    this.connectToTerminal(kind);
+  } else {
+    useDialog
+      .getState()
+      .showOkModalDialog(
+        "Failed to connect",
+        "Failed to connect to temporary workspace. This could happen because of network issues. Please refresh the page and try again."
+      );
+  }
 }
-private connectToNotebookTerminal(kind: ViewModels.TerminalKind): void {
+private connectToTerminal(kind: ViewModels.TerminalKind): void {
   let title: string;
   switch (kind) {
@@ -1158,7 +1130,7 @@ export default class Explorer {
     await this.initNotebooks(userContext.databaseAccount);
   }
-  await this.refreshSampleData();
+  this.refreshSampleData();
 }
public async configureCopilot(): Promise<void> { public async configureCopilot(): Promise<void> {
@@ -1183,26 +1155,27 @@ export default class Explorer {
.setCopilotSampleDBEnabled(copilotEnabled && copilotUserDBEnabled && copilotSampleDBEnabled); .setCopilotSampleDBEnabled(copilotEnabled && copilotUserDBEnabled && copilotSampleDBEnabled);
} }
-public async refreshSampleData(): Promise<void> {
-  try {
-    if (!userContext.sampleDataConnectionInfo) {
-      return;
-    }
-    const collection: DataModels.Collection = await readSampleCollection();
-    if (!collection) {
-      return;
-    }
-    const databaseId = userContext.sampleDataConnectionInfo?.databaseId;
-    if (!databaseId) {
-      return;
-    }
-    const sampleDataResourceTokenCollection = new ResourceTokenCollection(this, databaseId, collection, true);
-    useDatabases.setState({ sampleDataResourceTokenCollection });
-  } catch (error) {
-    Logger.logError(getErrorMessage(error), "Explorer");
-    return;
-  }
+public refreshSampleData(): void {
+  if (!userContext.sampleDataConnectionInfo) {
+    return;
+  }
+  const databaseId = userContext.sampleDataConnectionInfo?.databaseId;
+  if (!databaseId) {
+    return;
+  }
+  readSampleCollection()
+    .then((collection: DataModels.Collection) => {
+      if (!collection) {
+        return;
+      }
+      const sampleDataResourceTokenCollection = new ResourceTokenCollection(this, databaseId, collection, true);
+      useDatabases.setState({ sampleDataResourceTokenCollection });
+    })
+    .catch((error) => {
+      Logger.logError(getErrorMessage(error), "Explorer/refreshSampleData");
+    });
 }
 }

View File

@@ -125,13 +125,13 @@ export function createContextCommandBarButtons(
   const buttons: CommandButtonComponentProps[] = [];
   if (!selectedNodeState.isDatabaseNodeOrNoneSelected() && userContext.apiType === "Mongo") {
-    const label = useNotebook.getState().isShellEnabled ? "Open Mongo Shell" : "New Shell";
+    const label = (useNotebook.getState().isShellEnabled || userContext.features.enableCloudShell) ? "Open Mongo Shell" : "New Shell";
     const newMongoShellBtn: CommandButtonComponentProps = {
       iconSrc: HostedTerminalIcon,
       iconAlt: label,
       onCommandClick: () => {
         const selectedCollection: ViewModels.Collection = selectedNodeState.findSelectedCollection();
-        if (useNotebook.getState().isShellEnabled) {
+        if (useNotebook.getState().isShellEnabled || userContext.features.enableCloudShell) {
           container.openNotebookTerminal(ViewModels.TerminalKind.Mongo);
         } else {
           selectedCollection && selectedCollection.onNewMongoShellClick();
@@ -145,7 +145,7 @@ export function createContextCommandBarButtons(
   }
   if (
-    useNotebook.getState().isShellEnabled &&
+    (useNotebook.getState().isShellEnabled || userContext.features.enableCloudShell) &&
     !selectedNodeState.isDatabaseNodeOrNoneSelected() &&
     userContext.apiType === "Cassandra"
   ) {

View File

@@ -1,6 +1,5 @@
 import { isPublicInternetAccessAllowed } from "Common/DatabaseAccountUtility";
 import { PhoenixClient } from "Phoenix/PhoenixClient";
-import { useNewPortalBackendEndpoint } from "Utils/EndpointUtils";
 import { cloneDeep } from "lodash";
 import create, { UseStore } from "zustand";
 import { AuthType } from "../../AuthType";
@@ -128,9 +127,7 @@ export const useNotebook: UseStore<NotebookState> = create((set, get) => ({
userContext.apiType === "Postgres" || userContext.apiType === "VCoreMongo" userContext.apiType === "Postgres" || userContext.apiType === "VCoreMongo"
? databaseAccount?.location ? databaseAccount?.location
: databaseAccount?.properties?.writeLocations?.[0]?.locationName.toLowerCase(); : databaseAccount?.properties?.writeLocations?.[0]?.locationName.toLowerCase();
const disallowedLocationsUri: string = useNewPortalBackendEndpoint(Constants.BackendApi.DisallowedLocations) const disallowedLocationsUri: string = `${configContext.PORTAL_BACKEND_ENDPOINT}/api/disallowedlocations`;
? `${configContext.PORTAL_BACKEND_ENDPOINT}/api/disallowedlocations`
: `${configContext.BACKEND_ENDPOINT}/api/disallowedLocations`;
const authorizationHeader = getAuthorizationHeader(); const authorizationHeader = getAuthorizationHeader();
try { try {
const response = await fetch(disallowedLocationsUri, { const response = await fetch(disallowedLocationsUri, {

View File

@@ -865,6 +865,7 @@ export class AddCollectionPanel extends React.Component<AddCollectionPanelProps,
 <Link
   href="https://aka.ms/cosmosdb-synapselink"
   target="_blank"
+  aria-label={Constants.ariaLabelForLearnMoreLink.AzureSynapseLink}
   className="capacitycalculator-link"
 >
   Learn more
@@ -1222,7 +1223,11 @@ export class AddCollectionPanel extends React.Component<AddCollectionPanelProps,
<Text variant="small"> <Text variant="small">
Enable analytical store capability to perform near real-time analytics on your operational data, without Enable analytical store capability to perform near real-time analytics on your operational data, without
impacting the performance of transactional workloads.{" "} impacting the performance of transactional workloads.{" "}
<Link target="_blank" href="https://aka.ms/analytical-store-overview"> <Link
aria-label={Constants.ariaLabelForLearnMoreLink.AnalyticalStore}
target="_blank"
href="https://aka.ms/analytical-store-overview"
>
Learn more Learn more
</Link> </Link>
</Text> </Text>

View File

@@ -174,15 +174,26 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
   const styles = useStyles();
   const explorerVersion = configContext.gitSha;
+  const isEmulator = configContext.platform === Platform.Emulator;
   const shouldShowQueryPageOptions = userContext.apiType === "SQL";
-  const shouldShowGraphAutoVizOption = userContext.apiType === "Gremlin";
-  const shouldShowCrossPartitionOption = userContext.apiType !== "Gremlin";
-  const shouldShowParallelismOption = userContext.apiType !== "Gremlin";
-  const shouldShowPriorityLevelOption = PriorityBasedExecutionUtils.isFeatureEnabled();
+  const showRetrySettings =
+    (userContext.apiType === "SQL" || userContext.apiType === "Tables" || userContext.apiType === "Gremlin") &&
+    !isEmulator;
+  const shouldShowGraphAutoVizOption = userContext.apiType === "Gremlin" && !isEmulator;
+  const shouldShowCrossPartitionOption = userContext.apiType !== "Gremlin" && !isEmulator;
+  const shouldShowParallelismOption = userContext.apiType !== "Gremlin" && !isEmulator;
+  const showEnableEntraIdRbac =
+    userContext.apiType === "SQL" &&
+    userContext.authType === AuthType.AAD &&
+    configContext.platform !== Platform.Fabric &&
+    !isEmulator;
+  const shouldShowPriorityLevelOption = PriorityBasedExecutionUtils.isFeatureEnabled() && !isEmulator;
   const shouldShowCopilotSampleDBOption =
     userContext.apiType === "SQL" &&
     useQueryCopilot.getState().copilotEnabled &&
-    useDatabases.getState().sampleDataResourceTokenCollection;
+    useDatabases.getState().sampleDataResourceTokenCollection &&
+    !isEmulator;
   const handlerOnSubmit = async () => {
     setIsExecuting(true);
@@ -491,7 +502,7 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
   return (
     <RightPaneForm {...genericPaneProps}>
       <div className={`paneMainContent ${styles.container}`}>
-        <Accordion className={styles.firstItem}>
+        <Accordion className={`customAccordion ${styles.firstItem}`}>
           {shouldShowQueryPageOptions && (
             <AccordionItem value="1">
               <AccordionHeader>
@@ -541,39 +552,37 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
             </AccordionPanel>
           </AccordionItem>
         )}
-        {userContext.apiType === "SQL" &&
-          userContext.authType === AuthType.AAD &&
-          configContext.platform !== Platform.Fabric && (
-            <AccordionItem value="2">
-              <AccordionHeader>
-                <div className={styles.header}>Enable Entra ID RBAC</div>
-              </AccordionHeader>
-              <AccordionPanel>
-                <div className={styles.settingsSectionContainer}>
-                  <div className={styles.settingsSectionDescription}>
-                    Choose Automatic to enable Entra ID RBAC automatically. True/False to force enable/disable Entra
-                    ID RBAC.
-                    <a
-                      href="https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-setup-rbac#use-data-explorer"
-                      target="_blank"
-                      rel="noopener noreferrer"
-                    >
-                      {" "}
-                      Learn more{" "}
-                    </a>
-                  </div>
-                  <ChoiceGroup
-                    ariaLabelledBy="enableDataPlaneRBACOptions"
-                    options={dataPlaneRBACOptionsList}
-                    styles={choiceButtonStyles}
-                    selectedKey={enableDataPlaneRBACOption}
-                    onChange={handleOnDataPlaneRBACOptionChange}
-                  />
-                </div>
-              </AccordionPanel>
-            </AccordionItem>
-          )}
-        {userContext.apiType === "SQL" && (
+        {showEnableEntraIdRbac && (
+          <AccordionItem value="2">
+            <AccordionHeader>
+              <div className={styles.header}>Enable Entra ID RBAC</div>
+            </AccordionHeader>
+            <AccordionPanel>
+              <div className={styles.settingsSectionContainer}>
+                <div className={styles.settingsSectionDescription}>
+                  Choose Automatic to enable Entra ID RBAC automatically. True/False to force enable/disable Entra ID
+                  RBAC.
+                  <a
+                    href="https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-setup-rbac#use-data-explorer"
+                    target="_blank"
+                    rel="noopener noreferrer"
+                  >
+                    {" "}
+                    Learn more{" "}
+                  </a>
+                </div>
+                <ChoiceGroup
+                  ariaLabelledBy="enableDataPlaneRBACOptions"
+                  options={dataPlaneRBACOptionsList}
+                  styles={choiceButtonStyles}
+                  selectedKey={enableDataPlaneRBACOption}
+                  onChange={handleOnDataPlaneRBACOptionChange}
+                />
+              </div>
+            </AccordionPanel>
+          </AccordionItem>
+        )}
+        {userContext.apiType === "SQL" && !isEmulator && (
           <>
             <AccordionItem value="3">
               <AccordionHeader>
@@ -671,7 +680,7 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
           </AccordionItem>
         </>
       )}
-      {(userContext.apiType === "SQL" || userContext.apiType === "Tables" || userContext.apiType === "Gremlin") && (
+      {showRetrySettings && (
         <AccordionItem value="6">
           <AccordionHeader>
             <div className={styles.header}>Retry Settings</div>
@@ -744,29 +753,30 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
           </AccordionPanel>
         </AccordionItem>
       )}
-      <AccordionItem value="7">
-        <AccordionHeader>
-          <div className={styles.header}>Enable container pagination</div>
-        </AccordionHeader>
-        <AccordionPanel>
-          <div className={styles.settingsSectionContainer}>
-            <div className={styles.settingsSectionDescription}>
-              Load 50 containers at a time. Currently, containers are not pulled in alphanumeric order.
-            </div>
-            <Checkbox
-              styles={{
-                label: { padding: 0 },
-              }}
-              className="padding"
-              ariaLabel="Enable container pagination"
-              checked={containerPaginationEnabled}
-              onChange={() => setContainerPaginationEnabled(!containerPaginationEnabled)}
-              label="Enable container pagination"
-            />
-          </div>
-        </AccordionPanel>
-      </AccordionItem>
+      {!isEmulator && (
+        <AccordionItem value="7">
+          <AccordionHeader>
+            <div className={styles.header}>Enable container pagination</div>
+          </AccordionHeader>
+          <AccordionPanel>
+            <div className={styles.settingsSectionContainer}>
+              <div className={styles.settingsSectionDescription}>
+                Load 50 containers at a time. Currently, containers are not pulled in alphanumeric order.
+              </div>
+              <Checkbox
+                styles={{
+                  label: { padding: 0 },
+                }}
+                className="padding"
+                ariaLabel="Enable container pagination"
+                checked={containerPaginationEnabled}
+                onChange={() => setContainerPaginationEnabled(!containerPaginationEnabled)}
+                label="Enable container pagination"
+              />
+            </div>
+          </AccordionPanel>
+        </AccordionItem>
+      )}
       {shouldShowCrossPartitionOption && (
         <AccordionItem value="8">
           <AccordionHeader>

View File

@@ -11,7 +11,7 @@ exports[`Settings Pane should render Default properly 1`] = `
className="paneMainContent ___133e6fg_0000000 f22iagw f1vx9l62 f1l02sjl" className="paneMainContent ___133e6fg_0000000 f22iagw f1vx9l62 f1l02sjl"
> >
<Accordion <Accordion
className="___1uf6361_0000000 fz7g6wx" className="customAccordion ___1uf6361_0000000 fz7g6wx"
> >
<AccordionItem <AccordionItem
value="1" value="1"
@@ -572,7 +572,7 @@ exports[`Settings Pane should render Gremlin properly 1`] = `
className="paneMainContent ___133e6fg_0000000 f22iagw f1vx9l62 f1l02sjl" className="paneMainContent ___133e6fg_0000000 f22iagw f1vx9l62 f1l02sjl"
> >
<Accordion <Accordion
className="___1uf6361_0000000 fz7g6wx" className="customAccordion ___1uf6361_0000000 fz7g6wx"
> >
<AccordionItem <AccordionItem
value="6" value="6"

View File

@@ -319,6 +319,7 @@ exports[`AddCollectionPanel should render Default properly 1`] = `
   Enable analytical store capability to perform near real-time analytics on your operational data, without impacting the performance of transactional workloads.
   <StyledLinkBase
+    aria-label="Learn more about analytical store."
     href="https://aka.ms/analytical-store-overview"
     target="_blank"
   >
@@ -383,6 +384,7 @@ exports[`AddCollectionPanel should render Default properly 1`] = `
   . Enable Synapse Link for this Cosmos DB account.
   <StyledLinkBase
+    aria-label="Learn more about Azure Synapse Link."
     className="capacitycalculator-link"
     href="https://aka.ms/cosmosdb-synapselink"
     target="_blank"

View File

@@ -75,6 +75,7 @@ export const QueryCopilotPromptbar: React.FC<QueryCopilotPromptProps> = ({
   const inputEdited = useRef(false);
   const itemRefs = useRef([]);
   const searchInputRef = useRef(null);
+  const copyQueryRef = useRef(null);
   const {
     openFeedbackModal,
     hideFeedbackModalForLikedQueries,
@@ -132,6 +133,7 @@ export const QueryCopilotPromptbar: React.FC<QueryCopilotPromptProps> = ({
     document.body.removeChild(queryElement);
     setshowCopyPopup(true);
+    copyQueryRef.current.focus();
     setTimeout(() => {
       setshowCopyPopup(false);
     }, 6000);
@@ -305,7 +307,7 @@ export const QueryCopilotPromptbar: React.FC<QueryCopilotPromptProps> = ({
     if (isGeneratingQuery === null) {
       return " ";
     } else if (isGeneratingQuery) {
-      return "Content is loading";
+      return "Thinking";
     } else {
       return "Content is updated";
     }
@@ -400,6 +402,7 @@ export const QueryCopilotPromptbar: React.FC<QueryCopilotPromptProps> = ({
     <IconButton
       iconProps={{ iconName: "Send" }}
       disabled={isGeneratingQuery || !userPrompt.trim()}
+      allowDisabledFocus={true}
       style={{ background: "none" }}
       onClick={() => startGenerateQueryProcess()}
       aria-label="Send"
@@ -676,6 +679,7 @@ export const QueryCopilotPromptbar: React.FC<QueryCopilotPromptProps> = ({
   )}
   <CommandBarButton
     className="copyQuery"
+    elementRef={copyQueryRef}
     onClick={copyGeneratedCode}
     iconProps={{ iconName: "Copy" }}
     style={{ fontSize: 12, transition: "background-color 0.3s ease", height: "100%" }}
@@ -706,6 +710,9 @@ export const QueryCopilotPromptbar: React.FC<QueryCopilotPromptProps> = ({
     )}
   </Stack>
 )}
+{(showFeedbackBar || isGeneratingQuery) && (
+  <span role="alert" className="screenReaderOnly" aria-label={getAriaLabel()} />
+)}
 {isGeneratingQuery && (
   <ProgressIndicator
     label="Thinking..."

View File

@@ -1,7 +1,6 @@
 import { FeedOptions } from "@azure/cosmos";
 import {
   Areas,
-  BackendApi,
   ConnectionStatusType,
   ContainerStatusType,
   HttpStatusCodes,
@@ -32,7 +31,6 @@ import { Action } from "Shared/Telemetry/TelemetryConstants";
 import { traceFailure, traceStart, traceSuccess } from "Shared/Telemetry/TelemetryProcessor";
 import { userContext } from "UserContext";
 import { getAuthorizationHeader } from "Utils/AuthorizationUtils";
-import { useNewPortalBackendEndpoint } from "Utils/EndpointUtils";
 import { queryPagesUntilContentPresent } from "Utils/QueryUtils";
 import { QueryCopilotState, useQueryCopilot } from "hooks/useQueryCopilot";
 import { useTabs } from "hooks/useTabs";
@@ -82,9 +80,7 @@ export const isCopilotFeatureRegistered = async (subscriptionId: string): Promis
 };
 export const getCopilotEnabled = async (): Promise<boolean> => {
-  const backendEndpoint: string = useNewPortalBackendEndpoint(BackendApi.PortalSettings)
-    ? configContext.PORTAL_BACKEND_ENDPOINT
-    : configContext.BACKEND_ENDPOINT;
+  const backendEndpoint: string = configContext.PORTAL_BACKEND_ENDPOINT;
   const url = `${backendEndpoint}/api/portalsettings/querycopilot`;
   const authorizationHeader: AuthorizationTokenHeaderMetadata = getAuthorizationHeader();

View File

@@ -26,7 +26,7 @@ import { getCollectionName, getDatabaseName } from "Utils/APITypeUtils";
 import { Allotment, AllotmentHandle } from "allotment";
 import { useSidePanel } from "hooks/useSidePanel";
 import { debounce } from "lodash";
-import React, { useCallback, useEffect, useMemo, useRef, useState } from "react";
+import React, { useCallback, useEffect, useLayoutEffect, useMemo, useRef, useState } from "react";
 const useSidebarStyles = makeStyles({
   sidebar: {
@@ -109,6 +109,7 @@ interface GlobalCommand {
   icon: JSX.Element;
   onClick: () => void;
   keyboardAction?: KeyboardAction;
+  ref?: React.RefObject<HTMLButtonElement>;
 }
 const GlobalCommands: React.FC<GlobalCommandsProps> = ({ explorer }) => {
@@ -118,6 +119,7 @@ const GlobalCommands: React.FC<GlobalCommandsProps> = ({ explorer }) => {
   // However, that messes with the Menu positioning, so we need to get a reference to the 'div' to pass to the Menu.
   // We can't use a ref though, because it would be set after the Menu is rendered, so we use a state value to force a re-render.
   const [globalCommandButton, setGlobalCommandButton] = useState<HTMLElement | null>(null);
+  const primaryFocusableRef = useRef<HTMLButtonElement>(null);
   const actions = useMemo<GlobalCommand[]>(() => {
     if (
@@ -177,6 +179,16 @@ const GlobalCommands: React.FC<GlobalCommandsProps> = ({ explorer }) => {
     );
   }, [actions, setKeyboardActions]);
+  useLayoutEffect(() => {
+    if (primaryFocusableRef.current) {
+      const timer = setTimeout(() => {
+        primaryFocusableRef.current.focus();
+      }, 0);
+      return () => clearTimeout(timer);
+    }
+    return undefined;
+  }, []);
   if (!primaryAction) {
     return null;
   }
@@ -184,7 +196,7 @@ const GlobalCommands: React.FC<GlobalCommandsProps> = ({ explorer }) => {
   return (
     <div className={styles.globalCommandsContainer} data-test="GlobalCommands">
       {actions.length === 1 ? (
-        <Button icon={primaryAction.icon} onClick={onPrimaryActionClick}>
+        <Button icon={primaryAction.icon} onClick={onPrimaryActionClick} ref={primaryFocusableRef}>
           {primaryAction.label}
         </Button>
       ) : (
@@ -194,7 +206,7 @@ const GlobalCommands: React.FC<GlobalCommandsProps> = ({ explorer }) => {
         <div ref={setGlobalCommandButton}>
           <SplitButton
             menuButton={{ ...triggerProps, "aria-label": "More commands" }}
-            primaryActionButton={{ onClick: onPrimaryActionClick }}
+            primaryActionButton={{ onClick: onPrimaryActionClick, ref: primaryFocusableRef }}
             className={styles.globalCommandsSplitButton}
             icon={primaryAction.icon}
           >
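The layout effect above defers focus by one tick so the primary command button receives keyboard focus once it has mounted. The same pattern can be factored into a small hook; this is an illustrative extraction, not code from this PR:

import { RefObject, useLayoutEffect } from "react";

// Focus the referenced element on mount, deferred one tick so it has finished mounting.
export function useInitialFocus<T extends HTMLElement>(ref: RefObject<T>): void {
  useLayoutEffect(() => {
    if (!ref.current) {
      return undefined;
    }
    const timer = setTimeout(() => ref.current?.focus(), 0);
    return () => clearTimeout(timer);
  }, [ref]);
}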

View File

@@ -39,7 +39,7 @@ export const SplashScreenButton: React.FC<SplashScreenButtonProps> = ({
role="button" role="button"
> >
<div> <div>
<img src={imgSrc} /> <img src={imgSrc} alt={title} aria-hidden="true" />
</div> </div>
<Stack style={{ marginLeft: 16 }}> <Stack style={{ marginLeft: 16 }}>
<Text style={{ fontSize: 18, fontWeight: 600 }}>{title}</Text> <Text style={{ fontSize: 18, fontWeight: 600 }}>{title}</Text>

View File

@@ -3,7 +3,7 @@ import * as ko from "knockout";
import Q from "q"; import Q from "q";
import { AuthType } from "../../AuthType"; import { AuthType } from "../../AuthType";
import * as Constants from "../../Common/Constants"; import * as Constants from "../../Common/Constants";
import { CassandraProxyAPIs, CassandraProxyEndpoints } from "../../Common/Constants"; import { CassandraProxyAPIs } from "../../Common/Constants";
import { handleError } from "../../Common/ErrorHandlingUtils"; import { handleError } from "../../Common/ErrorHandlingUtils";
import * as HeadersUtility from "../../Common/HeadersUtility"; import * as HeadersUtility from "../../Common/HeadersUtility";
import { createDocument } from "../../Common/dataAccess/createDocument"; import { createDocument } from "../../Common/dataAccess/createDocument";
@@ -264,9 +264,6 @@ export class CassandraAPIDataClient extends TableDataClient {
   shouldNotify?: boolean,
   paginationToken?: string,
 ): Promise<Entities.IListTableEntitiesResult> {
-  if (!this.useCassandraProxyEndpoint("postQuery")) {
-    return this.queryDocuments_ToBeDeprecated(collection, query, shouldNotify, paginationToken);
-  }
   const clearMessage =
     shouldNotify && NotificationConsoleUtils.logConsoleProgress(`Querying rows for table ${collection.id()}`);
   try {
@@ -309,55 +306,6 @@ export class CassandraAPIDataClient extends TableDataClient {
   }
 }
public async queryDocuments_ToBeDeprecated(
collection: ViewModels.Collection,
query: string,
shouldNotify?: boolean,
paginationToken?: string,
): Promise<Entities.IListTableEntitiesResult> {
const clearMessage =
shouldNotify && NotificationConsoleUtils.logConsoleProgress(`Querying rows for table ${collection.id()}`);
try {
const { authType, databaseAccount } = userContext;
const apiEndpoint: string =
authType === AuthType.EncryptedToken
? Constants.CassandraBackend.guestQueryApi
: Constants.CassandraBackend.queryApi;
const data: any = await $.ajax(`${configContext.BACKEND_ENDPOINT}/${apiEndpoint}`, {
type: "POST",
data: {
accountName: databaseAccount?.name,
cassandraEndpoint: this.trimCassandraEndpoint(databaseAccount?.properties.cassandraEndpoint),
resourceId: databaseAccount?.id,
keyspaceId: collection.databaseId,
tableId: collection.id(),
query,
paginationToken,
},
beforeSend: this.setAuthorizationHeader as any,
cache: false,
});
shouldNotify &&
NotificationConsoleUtils.logConsoleInfo(
`Successfully fetched ${data.result.length} rows for table ${collection.id()}`,
);
return {
Results: data.result,
ContinuationToken: data.paginationToken,
};
} catch (error) {
shouldNotify &&
handleError(
error,
"QueryDocuments_ToBeDeprecated_Cassandra",
`Failed to query rows for table ${collection.id()}`,
);
throw error;
} finally {
clearMessage?.();
}
}
 public async deleteDocuments(
   collection: ViewModels.Collection,
   entitiesToDelete: Entities.ITableEntity[],
@@ -471,10 +419,6 @@ export class CassandraAPIDataClient extends TableDataClient {
 }
 public getTableKeys(collection: ViewModels.Collection): Q.Promise<CassandraTableKeys> {
if (!this.useCassandraProxyEndpoint("getKeys")) {
return this.getTableKeys_ToBeDeprecated(collection);
}
   if (!!collection.cassandraKeys) {
     return Q.resolve(collection.cassandraKeys);
   }
@@ -515,52 +459,7 @@ export class CassandraAPIDataClient extends TableDataClient {
   return deferred.promise;
 }
public getTableKeys_ToBeDeprecated(collection: ViewModels.Collection): Q.Promise<CassandraTableKeys> {
if (!!collection.cassandraKeys) {
return Q.resolve(collection.cassandraKeys);
}
const clearInProgressMessage = logConsoleProgress(`Fetching keys for table ${collection.id()}`);
const { authType, databaseAccount } = userContext;
const apiEndpoint: string =
authType === AuthType.EncryptedToken
? Constants.CassandraBackend.guestKeysApi
: Constants.CassandraBackend.keysApi;
let endpoint = `${configContext.BACKEND_ENDPOINT}/${apiEndpoint}`;
const deferred = Q.defer<CassandraTableKeys>();
$.ajax(endpoint, {
type: "POST",
data: {
accountName: databaseAccount?.name,
cassandraEndpoint: this.trimCassandraEndpoint(databaseAccount?.properties.cassandraEndpoint),
resourceId: databaseAccount?.id,
keyspaceId: collection.databaseId,
tableId: collection.id(),
},
beforeSend: this.setAuthorizationHeader as any,
cache: false,
})
.then(
(data: CassandraTableKeys) => {
collection.cassandraKeys = data;
logConsoleInfo(`Successfully fetched keys for table ${collection.id()}`);
deferred.resolve(data);
},
(error: any) => {
const errorText = error.responseJSON?.message ?? JSON.stringify(error);
handleError(errorText, "FetchKeysCassandra", `Error fetching keys for table ${collection.id()}`);
deferred.reject(errorText);
},
)
.done(clearInProgressMessage);
return deferred.promise;
}
 public getTableSchema(collection: ViewModels.Collection): Q.Promise<CassandraTableKey[]> {
if (!this.useCassandraProxyEndpoint("getSchema")) {
return this.getTableSchema_ToBeDeprecated(collection);
}
   if (!!collection.cassandraSchema) {
     return Q.resolve(collection.cassandraSchema);
   }
@@ -602,52 +501,7 @@ export class CassandraAPIDataClient extends TableDataClient {
   return deferred.promise;
 }
public getTableSchema_ToBeDeprecated(collection: ViewModels.Collection): Q.Promise<CassandraTableKey[]> {
if (!!collection.cassandraSchema) {
return Q.resolve(collection.cassandraSchema);
}
const clearInProgressMessage = logConsoleProgress(`Fetching schema for table ${collection.id()}`);
const { databaseAccount, authType } = userContext;
const apiEndpoint: string =
authType === AuthType.EncryptedToken
? Constants.CassandraBackend.guestSchemaApi
: Constants.CassandraBackend.schemaApi;
let endpoint = `${configContext.BACKEND_ENDPOINT}/${apiEndpoint}`;
const deferred = Q.defer<CassandraTableKey[]>();
$.ajax(endpoint, {
type: "POST",
data: {
accountName: databaseAccount?.name,
cassandraEndpoint: this.trimCassandraEndpoint(databaseAccount?.properties.cassandraEndpoint),
resourceId: databaseAccount?.id,
keyspaceId: collection.databaseId,
tableId: collection.id(),
},
beforeSend: this.setAuthorizationHeader as any,
cache: false,
})
.then(
(data: any) => {
collection.cassandraSchema = data.columns;
logConsoleInfo(`Successfully fetched schema for table ${collection.id()}`);
deferred.resolve(data.columns);
},
(error: any) => {
const errorText = error.responseJSON?.message ?? JSON.stringify(error);
handleError(errorText, "FetchSchemaCassandra", `Error fetching schema for table ${collection.id()}`);
deferred.reject(errorText);
},
)
.done(clearInProgressMessage);
return deferred.promise;
}
 private createOrDeleteQuery(cassandraEndpoint: string, resourceId: string, query: string): Q.Promise<any> {
if (!this.useCassandraProxyEndpoint("createOrDelete")) {
return this.createOrDeleteQuery_ToBeDeprecated(cassandraEndpoint, resourceId, query);
}
   const deferred = Q.defer();
   const { authType, databaseAccount } = userContext;
   const apiEndpoint: string =
@@ -677,38 +531,6 @@ export class CassandraAPIDataClient extends TableDataClient {
   return deferred.promise;
 }
private createOrDeleteQuery_ToBeDeprecated(
cassandraEndpoint: string,
resourceId: string,
query: string,
): Q.Promise<any> {
const deferred = Q.defer();
const { authType, databaseAccount } = userContext;
const apiEndpoint: string =
authType === AuthType.EncryptedToken
? Constants.CassandraBackend.guestCreateOrDeleteApi
: Constants.CassandraBackend.createOrDeleteApi;
$.ajax(`${configContext.BACKEND_ENDPOINT}/${apiEndpoint}`, {
type: "POST",
data: {
accountName: databaseAccount?.name,
cassandraEndpoint: this.trimCassandraEndpoint(cassandraEndpoint),
resourceId: resourceId,
query: query,
},
beforeSend: this.setAuthorizationHeader as any,
cache: false,
}).then(
(data: any) => {
deferred.resolve();
},
(reason) => {
deferred.reject(reason);
},
);
return deferred.promise;
}
 private trimCassandraEndpoint(cassandraEndpoint: string): string {
   if (!cassandraEndpoint) {
     return cassandraEndpoint;
@@ -747,23 +569,4 @@ export class CassandraAPIDataClient extends TableDataClient {
 private getCassandraPartitionKeyProperty(collection: ViewModels.Collection): string {
   return collection.cassandraKeys.partitionKeys[0].property;
 }
private useCassandraProxyEndpoint(api: string): boolean {
const activeCassandraProxyEndpoints: string[] = [
CassandraProxyEndpoints.Development,
CassandraProxyEndpoints.Mpac,
CassandraProxyEndpoints.Prod,
CassandraProxyEndpoints.Fairfax,
CassandraProxyEndpoints.Mooncake,
];
if (configContext.globallyEnabledCassandraAPIs.includes(api)) {
return true;
}
return (
configContext.NEW_CASSANDRA_APIS?.includes(api) &&
activeCassandraProxyEndpoints.includes(configContext.CASSANDRA_PROXY_ENDPOINT)
);
}
 }

View File

@@ -0,0 +1,126 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
*/
import { IDisposable, ITerminalAddon, Terminal } from 'xterm';
interface IAttachOptions {
bidirectional?: boolean;
}
export class AttachAddon implements ITerminalAddon {
private _socket: WebSocket;
private _bidirectional: boolean;
private _disposables: IDisposable[] = [];
private _socketData: string;
constructor(socket: WebSocket, options?: IAttachOptions) {
this._socket = socket;
// always set binary type to arraybuffer, we do not handle blobs
this._socket.binaryType = 'arraybuffer';
this._bidirectional = !(options && options.bidirectional === false);
this._socketData = '';
}
public activate(terminal: Terminal): void {
this._disposables.push(
addSocketListener(this._socket, 'message', ev => {
let data: ArrayBuffer | string = ev.data;
const startStatusJson = 'ie_us';
const endStatusJson = 'ie_ue';
if (typeof data === 'object') {
const enc = new TextDecoder("utf-8");
data = enc.decode(ev.data as any);
}
// see the socket.onMessage handler in TerminalHelper for an example of the status JSON payload
if (data.includes(startStatusJson) && data.includes(endStatusJson)) {
// process as one line
const statusData = data.split(startStatusJson)[1].split(endStatusJson)[0];
data = data.replace(statusData, '');
data = data.replace(startStatusJson, '');
data = data.replace(endStatusJson, '');
} else if (data.includes(startStatusJson)) {
// check for start
const partialStatusData = data.split(startStatusJson)[1];
this._socketData += partialStatusData;
data = data.replace(partialStatusData, '');
data = data.replace(startStatusJson, '');
} else if (data.includes(endStatusJson)) {
// check for end and process the command
const partialStatusData = data.split(endStatusJson)[0];
this._socketData += partialStatusData;
data = data.replace(partialStatusData, '');
data = data.replace(endStatusJson, '');
this._socketData = '';
} else if (this._socketData.length > 0) {
// check if the line is all data then just concatenate
this._socketData += data;
data = '';
}
terminal.write(data);
})
);
if (this._bidirectional) {
this._disposables.push(terminal.onData(data => this._sendData(data)));
this._disposables.push(terminal.onBinary(data => this._sendBinary(data)));
}
this._disposables.push(addSocketListener(this._socket, 'close', () => this.dispose()));
this._disposables.push(addSocketListener(this._socket, 'error', () => this.dispose()));
}
public dispose(): void {
for (const d of this._disposables) {
d.dispose();
}
}
private _sendData(data: string): void {
if (!this._checkOpenSocket()) {
return;
}
this._socket.send(data);
}
private _sendBinary(data: string): void {
if (!this._checkOpenSocket()) {
return;
}
const buffer = new Uint8Array(data.length);
for (let i = 0; i < data.length; ++i) {
buffer[i] = data.charCodeAt(i) & 255;
}
this._socket.send(buffer);
}
private _checkOpenSocket(): boolean {
switch (this._socket.readyState) {
case WebSocket.OPEN:
return true;
case WebSocket.CONNECTING:
throw new Error('Attach addon was loaded before socket was open');
case WebSocket.CLOSING:
return false;
case WebSocket.CLOSED:
throw new Error('Attach addon socket is closed');
default:
throw new Error('Unexpected socket state');
}
}
}
function addSocketListener<K extends keyof WebSocketEventMap>(socket: WebSocket, type: K, handler: (this: WebSocket, ev: WebSocketEventMap[K]) => any): IDisposable {
socket.addEventListener(type, handler);
return {
dispose: () => {
if (!handler) {
// Already disposed
return;
}
socket.removeEventListener(type, handler);
}
};
}
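A hedged usage sketch of wiring this addon to a terminal and a CloudShell WebSocket; the endpoint URL and container id are placeholders, and AttachAddon is assumed to be imported from this module. Frames wrapped in the ie_us/ie_ue markers are treated as status JSON and stripped before they reach the terminal.

// Illustrative usage of AttachAddon (not part of this file).
import { Terminal } from 'xterm';

const socketUri = 'wss://example.invalid/cloudshell/terminal'; // placeholder endpoint
const container = document.getElementById('terminal'); // assumes a #terminal element exists
const term = new Terminal();
if (container) {
  term.open(container);
}
const socket = new WebSocket(socketUri);
socket.addEventListener('open', () => {
  // Attach only once the socket is OPEN so _checkOpenSocket does not throw for CONNECTING.
  term.loadAddon(new AttachAddon(socket));
});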

View File

@@ -0,0 +1,76 @@
import React, { useEffect, useRef } from "react";
import { Terminal } from "xterm";
import { FitAddon } from 'xterm-addon-fit';
import "xterm/css/xterm.css";
import { TerminalKind } from "../../../Contracts/ViewModels";
import { startCloudShellTerminal } from "./Core/CloudShellTerminalCore";
export interface CloudShellTerminalProps {
shellType: TerminalKind;
}
export const CloudShellTerminalComponent: React.FC<CloudShellTerminalProps> = ({
shellType
}: CloudShellTerminalProps) => {
const terminalRef = useRef(null); // Reference for terminal container
const xtermRef = useRef(null); // Reference for XTerm instance
const socketRef = useRef(null); // Reference for WebSocket
const fitAddon = new FitAddon();
useEffect(() => {
// Initialize XTerm instance
const term = new Terminal({
cursorBlink: true,
cursorStyle: 'bar',
fontFamily: 'monospace',
fontSize: 14,
theme: {
background: "#1e1e1e",
foreground: "#d4d4d4",
cursor: "#ffcc00"
},
scrollback: 1000
});
term.loadAddon(fitAddon);
// Attach terminal to the DOM
if (terminalRef.current) {
term.open(terminalRef.current);
xtermRef.current = term;
}
if (fitAddon) {
fitAddon.fit();
}
// Adjust terminal size on window resize
const handleResize = () => fitAddon.fit();
window.addEventListener('resize', handleResize);
try {
socketRef.current = startCloudShellTerminal(term, shellType);
term.onData((data) => {
if (socketRef.current && socketRef.current.readyState === WebSocket.OPEN) {
socketRef.current.send(data);
}
});
} catch (error) {
console.error("Failed to initialize CloudShell terminal:", error);
term.writeln(`\x1B[31mError initializing terminal: ${error.message}\x1B[0m`);
}
// Cleanup function to remove the resize listener, close the WebSocket, and dispose the terminal
return () => {
if (socketRef.current) {
socketRef.current.close(); // Close WebSocket connection
}
window.removeEventListener('resize', handleResize);
term.dispose(); // Clean up XTerm instance
};
}, [shellType]);
return <div ref={terminalRef} style={{ width: "100%", height: "500px"}} />;
};
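A minimal host sketch; the tab or pane that actually renders this component is outside this hunk, so the wrapper below is illustrative only and assumes CloudShellTerminalComponent is in scope.

// Illustrative host for the CloudShell terminal (not code from this PR).
import React from "react";
import { TerminalKind } from "../../../Contracts/ViewModels";

export const MongoCloudShellTabSketch: React.FC = () => (
  <CloudShellTerminalComponent shellType={TerminalKind.Mongo} />
);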

View File

@@ -0,0 +1,152 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
*/
import { TerminalKind } from "../../../Contracts/ViewModels";
import { userContext } from "../../../UserContext";
export const getCommands = (terminalKind: TerminalKind, key: string) => {
let dbAccount = userContext.databaseAccount;
let endpoint;
switch (terminalKind) {
case TerminalKind.Postgres:
endpoint = dbAccount.properties.postgresqlEndpoint;
break;
case TerminalKind.Mongo:
endpoint = dbAccount.properties.mongoEndpoint;
break;
case TerminalKind.VCoreMongo:
endpoint = dbAccount.properties.vcoreMongoEndpoint;
break;
case TerminalKind.Cassandra:
endpoint = dbAccount.properties.cassandraEndpoint;
break;
default:
throw new Error("Unknown Terminal Kind");
}
let config = {
host: getHostFromUrl(endpoint),
name: dbAccount.name,
password: key,
endpoint: endpoint
};
return commands(terminalKind, config).join("\n").concat("\n");
};
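// Illustrative note (not part of this file): the terminal bootstrap code is expected to build
// the script for the chosen shell and write it to the CloudShell socket, e.g.
//   const script = getCommands(TerminalKind.Mongo, primaryKey);
//   socket.send(script);
// "primaryKey" and "socket" here are assumptions about the caller, not identifiers from this file.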
export interface CommandConfig {
host: string,
name: string,
password: string,
endpoint: string
}
export const commands = (terminalKind: TerminalKind, config?: CommandConfig): string[] => {
switch (terminalKind) {
case TerminalKind.Postgres:
return [
// 1. Fetch and display location details in a readable format
"curl -s https://ipinfo.io | jq -r '\"Region: \" + .region + \" Country: \" + .country + \" City: \" + .city + \" IP Addr: \" + .ip'",
// 2. Check if psql is installed; if not, proceed with installation
"if ! command -v psql &> /dev/null; then echo '⚠️ psql not found. Installing...'; fi",
// 3. Download PostgreSQL if not installed
"if ! command -v psql &> /dev/null; then curl -LO https://ftp.postgresql.org/pub/source/v15.2/postgresql-15.2.tar.bz2; fi",
// 4. Extract PostgreSQL package if not installed
"if ! command -v psql &> /dev/null; then tar -xvjf postgresql-15.2.tar.bz2; fi",
// 5. Create a directory for PostgreSQL installation if not installed
"if ! command -v psql &> /dev/null; then mkdir -p ~/pgsql; fi",
// 6. Download readline (dependency for PostgreSQL) if not installed
"if ! command -v psql &> /dev/null; then curl -LO https://ftp.gnu.org/gnu/readline/readline-8.1.tar.gz; fi",
// 7. Extract readline package if not installed
"if ! command -v psql &> /dev/null; then tar -xvzf readline-8.1.tar.gz; fi",
// 8. Configure readline if not installed
"if ! command -v psql &> /dev/null; then cd readline-8.1 && ./configure --prefix=$HOME/pgsql; fi",
// 9. Add PostgreSQL to PATH if not installed
"if ! command -v psql &> /dev/null; then echo 'export PATH=$HOME/pgsql/bin:$PATH' >> ~/.bashrc; fi",
// 10. Source .bashrc to update PATH (even if psql was already installed)
"source ~/.bashrc",
// 11. Verify PostgreSQL installation
"psql --version",
// 12. Prompt for database name and user, then connect with psql using a conninfo string
`read -p "Enter Database Name: " dbname && read -p "Enter Username: " username && psql "host=${config.endpoint} port=5432 dbname=$dbname user=$username sslmode=require"`
];
case TerminalKind.Mongo:
return [
// 1. Fetch and display location details in a readable format
"curl -s https://ipinfo.io | jq -r '\"Region: \" + .region + \" Country: \" + .country + \" City: \" + .city + \" IP Addr: \" + .ip'",
// 2. Check if mongosh is installed; if not, proceed with installation
"if ! command -v mongosh &> /dev/null; then echo '⚠️ mongosh not found. Installing...'; fi",
// 3. Download mongosh if not installed
"if ! command -v mongosh &> /dev/null; then curl -LO https://downloads.mongodb.com/compass/mongosh-2.3.8-linux-x64.tgz; fi",
// 4. Extract mongosh package if not installed
"if ! command -v mongosh &> /dev/null; then tar -xvzf mongosh-2.3.8-linux-x64.tgz; fi",
// 5. Move mongosh binaries if not installed
"if ! command -v mongosh &> /dev/null; then mkdir -p ~/mongosh && mv mongosh-2.3.8-linux-x64/* ~/mongosh/; fi",
// 6. Add mongosh to PATH if not installed
"if ! command -v mongosh &> /dev/null; then echo 'export PATH=$HOME/mongosh/bin:$PATH' >> ~/.bashrc; fi",
// 7. Source .bashrc to update PATH (even if mongosh was already installed)
"source ~/.bashrc",
// 8. Verify mongosh installation
"mongosh --version",
// 9. Login to MongoDB
`mongosh --host ${config.host} --port 10255 --username ${config.name} --password ${config.password} --tls --tlsAllowInvalidCertificates`
];
case TerminalKind.VCoreMongo:
return [
// 1. Fetch and display location details in a readable format
"curl -s https://ipinfo.io | jq -r '\"Region: \" + .region + \" Country: \" + .country + \" City: \" + .city + \" IP Addr: \" + .ip'",
// 2. Check if mongosh is installed; if not, proceed with installation
"if ! command -v mongosh &> /dev/null; then echo '⚠️ mongosh not found. Installing...'; fi",
// 3. Download mongosh if not installed
"if ! command -v mongosh &> /dev/null; then curl -LO https://downloads.mongodb.com/compass/mongosh-2.3.8-linux-x64.tgz; fi",
// 4. Extract mongosh package if not installed
"if ! command -v mongosh &> /dev/null; then tar -xvzf mongosh-2.3.8-linux-x64.tgz; fi",
// 5. Move mongosh binaries if not installed
"if ! command -v mongosh &> /dev/null; then mkdir -p ~/mongosh && mv mongosh-2.3.8-linux-x64/* ~/mongosh/; fi",
// 6. Add mongosh to PATH if not installed
"if ! command -v mongosh &> /dev/null; then echo 'export PATH=$HOME/mongosh/bin:$PATH' >> ~/.bashrc; fi",
// 7. Source .bashrc to update PATH (even if mongosh was already installed)
"source ~/.bashrc",
// 8. Verify mongosh installation
"mongosh --version",
// 9. Login to the vCore MongoDB cluster (prompts for username, then password)
`read -p "Enter username: " username && mongosh "mongodb+srv://$username@${config.endpoint}/?authMechanism=SCRAM-SHA-256&retrywrites=false&maxIdleTimeMS=120000" --tls --tlsAllowInvalidCertificates`
];
case TerminalKind.Cassandra:
return [
// 1. Fetch and display location details in a readable format
"curl -s https://ipinfo.io | jq -r '\"Region: \" + .region + \" Country: \" + .country + \" City: \" + .city + \" IP Addr: \" + .ip'",
// 2. Check if cqlsh is installed; if not, proceed with installation
"if ! command -v cqlsh &> /dev/null; then echo '⚠️ cqlsh not found. Installing...'; fi",
// 3. Download Cassandra if not installed
"if ! command -v cqlsh &> /dev/null; then curl -LO https://archive.apache.org/dist/cassandra/5.0.3/apache-cassandra-5.0.3-bin.tar.gz; fi",
// 4. Extract Cassandra package if not installed
"if ! command -v cqlsh &> /dev/null; then tar -xvzf apache-cassandra-5.0.3-bin.tar.gz; fi",
// 5. Move Cassandra binaries if not installed
"if ! command -v cqlsh &> /dev/null; then mkdir -p ~/cassandra && mv apache-cassandra-5.0.3/* ~/cassandra/; fi",
// 6. Add Cassandra to PATH if not installed
"if ! command -v cqlsh &> /dev/null; then echo 'export PATH=$HOME/cassandra/bin:$PATH' >> ~/.bashrc; fi",
// 7. Set environment variables for SSL
"if ! command -v cqlsh &> /dev/null; then echo 'export SSL_VERSION=TLSv1_2' >> ~/.bashrc; fi",
"if ! command -v cqlsh &> /dev/null; then echo 'export SSL_VALIDATE=false' >> ~/.bashrc; fi",
// 8. Source .bashrc to update PATH (even if cqlsh was already installed)
"source ~/.bashrc",
// 9. Verify cqlsh installation
"cqlsh --version",
// 10. Login to Cassandra
`cqlsh ${config.host} 10350 -u ${config.name} -p ${config.password} --ssl --protocol-version=4`
];
default:
return ["echo Unknown Shell"];
}
}
const getHostFromUrl = (endpoint: string): string => {
try {
const url = new URL(endpoint);
return url.hostname;
} catch (error) {
console.error("Invalid endpoint URL:", error);
return "";
}
};
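A rough sketch of how the command builder above is consumed, assuming the account in userContext is a Cosmos DB for MongoDB account and `key` holds its primary key; the generator's file name is not shown in this diff, so the import path is an assumption:

import { TerminalKind } from "../../../Contracts/ViewModels";
import { getCommands } from "./CommandGenerator"; // path assumed

// Produces a newline-joined bash script: location info, a conditional mongosh
// install into ~/mongosh, a version check, and finally the mongosh login line
// built from the account's mongoEndpoint host and the supplied key.
const startupScript = getCommands(TerminalKind.Mongo, key);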

View File

@@ -0,0 +1,393 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Core functionality for CloudShell terminal management
*/
import { Terminal } from "xterm";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { userContext } from "../../../../UserContext";
import {
authorizeSession,
connectTerminal,
provisionConsole,
putEphemeralUserSettings,
registerCloudShellProvider,
verifyCloudShellProviderRegistration
} from "../Data/CloudShellApiClient";
import { getNormalizedRegion } from "../Data/RegionUtils";
import { ShellTypeHandler } from "../ShellTypes/ShellTypeFactory";
import { AttachAddon } from "../Utils/AttachAddOn";
import { wait } from "../Utils/CommonUtils";
import { terminalLog } from "../Utils/LogFormatter";
// Constants
const DEFAULT_CLOUDSHELL_REGION = "westus";
const POLLING_INTERVAL_MS = 5000;
const MAX_RETRY_COUNT = 10;
const MAX_PING_COUNT = 20 * 60; // 20 minutes (60 seconds/minute)
/**
* Main function to start a CloudShell terminal
*/
export const startCloudShellTerminal = async (terminal: Terminal, shellType: TerminalKind) => {
// Get the shell handler for this type
const shellHandler = ShellTypeHandler.getHandler(shellType);
terminal.writeln(terminalLog.header("Initializing Azure CloudShell"));
await ensureCloudShellProviderRegistered(terminal);
const { resolvedRegion, defaultCloudShellRegion } = determineCloudShellRegion(terminal);
// Ask for user consent for region
const consentGranted = await askForRegionConsent(terminal, resolvedRegion);
if (!consentGranted) {
return {}; // Exit if user declined
}
// Check network requirements for this shell type
const networkConfig = await shellHandler.configureNetworkAccess(terminal, resolvedRegion);
terminal.writeln("");
// Provision CloudShell session
terminal.writeln(terminalLog.cloudshell("Provisioning started..."));
let sessionDetails: {
socketUri?: string;
provisionConsoleResponse?: any;
targetUri?: string;
};
try {
sessionDetails = await provisionCloudShellSession(resolvedRegion, terminal, networkConfig.vNetSettings, networkConfig.isAllPublicAccessEnabled);
} catch (err) {
terminal.writeln(terminalLog.error(err));
terminal.writeln(terminalLog.error("Failed to provision in primary region"));
terminal.writeln(terminalLog.warning(`Attempting with fallback region: ${defaultCloudShellRegion}`));
sessionDetails = await provisionCloudShellSession(defaultCloudShellRegion, terminal, networkConfig.vNetSettings, networkConfig.isAllPublicAccessEnabled);
}
if (!sessionDetails.socketUri) {
terminal.writeln(terminalLog.error('Unable to provision console. Please try again later.'));
return {};
}
// Configure WebSocket connection with shell-specific commands
const socket = await establishTerminalConnection(
terminal,
shellHandler,
sessionDetails.socketUri,
sessionDetails.provisionConsoleResponse,
sessionDetails.targetUri
);
return socket;
};
/**
* Ensures that the CloudShell provider is registered for the current subscription
*/
export const ensureCloudShellProviderRegistered = async (terminal: Terminal): Promise<void> => {
try {
terminal.writeln(terminalLog.info("Verifying provider registration..."));
const response: any = await verifyCloudShellProviderRegistration(userContext.subscriptionId);
if (response.registrationState !== "Registered") {
terminal.writeln(terminalLog.warning("Registering CloudShell provider..."));
await registerCloudShellProvider(userContext.subscriptionId);
terminal.writeln(terminalLog.success("Provider registration successful"));
}
} catch (err) {
terminal.writeln(terminalLog.error("Unable to verify provider registration"));
throw err;
}
};
/**
* Determines the appropriate CloudShell region
*/
export const determineCloudShellRegion = (terminal: Terminal): { resolvedRegion: string; defaultCloudShellRegion: string } => {
const region = userContext.databaseAccount?.location;
const resolvedRegion = getNormalizedRegion(region, DEFAULT_CLOUDSHELL_REGION);
return { resolvedRegion, defaultCloudShellRegion: DEFAULT_CLOUDSHELL_REGION };
};
/**
* Asks the user for consent to use the specified CloudShell region
*/
export const askForRegionConsent = async (terminal: Terminal, resolvedRegion: string): Promise<boolean> => {
terminal.writeln(terminalLog.header("CloudShell Region Confirmation"));
terminal.writeln(terminalLog.info("The CloudShell container will be provisioned in a specific Azure region."));
// Data residency and compliance information
terminal.writeln(terminalLog.subheader("Important Information"));
const dbRegion = userContext.databaseAccount?.location || "unknown";
terminal.writeln(terminalLog.item("Database Region", dbRegion));
terminal.writeln(terminalLog.item("CloudShell Container Region", resolvedRegion));
terminal.writeln(terminalLog.subheader("What this means to you?"));
terminal.writeln(terminalLog.item("Data Residency", "Commands and query results will be processed in this region"));
terminal.writeln(terminalLog.item("Network", "Database connections will originate from this region"));
// Consent question
terminal.writeln("");
terminal.writeln(terminalLog.prompt("Would you like to provision Azure CloudShell in the '" + resolvedRegion + "' region?"));
terminal.writeln(terminalLog.prompt("Press 'Y' to continue or 'N' to cancel (Y/N)"));
return new Promise<boolean>((resolve) => {
const keyListener = terminal.onKey(({ key }: { key: string }) => {
keyListener.dispose();
terminal.writeln("");
if (key.toLowerCase() === 'y') {
terminal.writeln(terminalLog.success("Proceeding with CloudShell in " + resolvedRegion));
terminal.writeln(terminalLog.separator());
resolve(true);
} else {
terminal.writeln(terminalLog.error("CloudShell provisioning canceled"));
setTimeout(() => terminal.dispose(), 2000);
resolve(false);
}
});
});
};
/**
* Provisions a CloudShell session
*/
export const provisionCloudShellSession = async (
resolvedRegion: string,
terminal: Terminal,
vNetSettings: object,
isAllPublicAccessEnabled: boolean
): Promise<{ socketUri?: string; provisionConsoleResponse?: any; targetUri?: string }> => {
return new Promise( async (resolve, reject) => {
try {
terminal.writeln(terminalLog.header("Configuring CloudShell Session"));
// Check if vNetSettings is available and not empty
const hasVNetSettings = vNetSettings && Object.keys(vNetSettings).length > 0;
if (hasVNetSettings) {
terminal.writeln(terminalLog.vnet("Enabling private network configuration"));
displayNetworkSettings(terminal, vNetSettings, resolvedRegion);
}
else {
terminal.writeln(terminalLog.warning("No VNet configuration provided"));
terminal.writeln(terminalLog.warning("CloudShell will be provisioned with public network access"));
if (!isAllPublicAccessEnabled) {
terminal.writeln(terminalLog.error("Warning: Your database has network restrictions"));
terminal.writeln(terminalLog.error("CloudShell may not be able to connect without proper VNet configuration"));
}
}
terminal.writeln(terminalLog.warning("Any previous VNet settings will be overridden"));
// Apply user settings
await putEphemeralUserSettings(userContext.subscriptionId, resolvedRegion, vNetSettings);
terminal.writeln(terminalLog.success("Session settings applied"));
// Provision console
let provisionConsoleResponse;
let attemptCounter = 0;
do {
provisionConsoleResponse = await provisionConsole(userContext.subscriptionId, resolvedRegion);
terminal.writeln(terminalLog.progress("Provisioning", provisionConsoleResponse.properties.provisioningState));
attemptCounter++;
if (provisionConsoleResponse.properties.provisioningState !== "Succeeded") {
await wait(POLLING_INTERVAL_MS);
}
} while (provisionConsoleResponse.properties.provisioningState !== "Succeeded" && attemptCounter < MAX_RETRY_COUNT);
if (provisionConsoleResponse.properties.provisioningState !== "Succeeded") {
const errorMessage = `Provisioning failed: ${provisionConsoleResponse.properties.provisioningState}`;
terminal.writeln(terminalLog.error(errorMessage));
return reject(new Error(errorMessage));
}
// Connect terminal
const connectTerminalResponse = await connectTerminal(
provisionConsoleResponse.properties.uri,
{ rows: terminal.rows, cols: terminal.cols }
);
const targetUri = `${provisionConsoleResponse.properties.uri}/terminals?cols=${terminal.cols}&rows=${terminal.rows}&version=2019-01-01&shell=bash`;
const termId = connectTerminalResponse.id;
// Determine socket URI
let socketUri = connectTerminalResponse.socketUri.replace(":443/", "");
const targetUriBody = targetUri.replace('https://', '').split('?')[0];
if (socketUri.indexOf(targetUriBody) === -1) {
socketUri = `wss://${targetUriBody}/${termId}`;
}
if (targetUriBody.includes('servicebus')) {
const targetUriBodyArr = targetUriBody.split('/');
socketUri = `wss://${targetUriBodyArr[0]}/$hc/${targetUriBodyArr[1]}/terminals/${termId}`;
}
return resolve({ socketUri, provisionConsoleResponse, targetUri });
} catch (err) {
terminal.writeln(terminalLog.error(`Provisioning failed: ${err.message}`));
return reject(err);
}
});
};
/**
* Display VNet settings in the terminal
*/
const displayNetworkSettings = (terminal: Terminal, vNetSettings: any, resolvedRegion: string): void => {
if (vNetSettings.networkProfileResourceId) {
const profileName = vNetSettings.networkProfileResourceId.split('/').pop();
terminal.writeln(terminalLog.item("Network Profile", profileName));
if (vNetSettings.relayNamespaceResourceId) {
const relayName = vNetSettings.relayNamespaceResourceId.split('/').pop();
terminal.writeln(terminalLog.item("Relay Namespace", relayName));
}
terminal.writeln(terminalLog.item("Region", resolvedRegion));
terminal.writeln(terminalLog.success("CloudShell will use this VNet to connect to your database"));
}
};
/**
* Establishes a terminal connection via WebSocket
*/
export const establishTerminalConnection = async (
terminal: Terminal,
shellHandler: any,
socketUri: string,
provisionConsoleResponse: any,
targetUri: string
): Promise<WebSocket> => {
let socket = new WebSocket(socketUri);
// Get shell-specific initial commands
const initCommands = await shellHandler.getInitialCommands();
// Configure the socket
socket = configureSocketConnection(socket, socketUri, terminal, initCommands, 0);
// Attach the terminal addon
const attachAddon = new AttachAddon(socket);
terminal.loadAddon(attachAddon);
terminal.writeln(terminalLog.success("Connection established"));
// Authorize the session
try {
const authorizeResponse = await authorizeSession(provisionConsoleResponse.properties.uri);
const cookieToken = authorizeResponse.token;
// Load auth token with a hidden image
const img = document.createElement("img");
img.src = `${targetUri}&token=${encodeURIComponent(cookieToken)}`;
terminal.focus();
} catch (err) {
terminal.writeln(terminalLog.error("Authorization failed"));
socket.close();
throw err;
}
return socket;
};
/**
* Configures a WebSocket connection for the terminal
*/
export const configureSocketConnection = (
socket: WebSocket,
uri: string,
terminal: Terminal,
initCommands: string,
socketRetryCount: number
): WebSocket => {
let jsonData = '';
let keepAliveID: NodeJS.Timeout = null;
let pingCount = 0;
sendTerminalStartupCommands(socket, initCommands);
socket.onclose = () => {
if (keepAliveID) {
clearTimeout(keepAliveID);
pingCount = 0;
}
terminal.writeln(terminalLog.warning("Session terminated. Refresh the page to start a new session."));
};
socket.onerror = () => {
if (socketRetryCount < MAX_RETRY_COUNT && socket.readyState !== WebSocket.CLOSED) {
configureSocketConnection(socket, uri, terminal, initCommands, socketRetryCount + 1);
} else {
socket.close();
}
};
socket.onmessage = (event: MessageEvent<string>) => {
pingCount = 0; // Reset ping count on message receipt
let eventData = '';
if (typeof event.data === "object") {
try {
const enc = new TextDecoder("utf-8");
eventData = enc.decode(event.data as any);
} catch (e) {
// Not an array buffer, ignore
}
}
if (typeof event.data === 'string') {
eventData = event.data;
}
// Process event data
if (eventData.includes("ie_us") && eventData.includes("ie_ue")) {
const statusData = eventData.split('ie_us')[1].split('ie_ue')[0];
console.log(statusData);
} else if (eventData.includes("ie_us")) {
jsonData += eventData.split('ie_us')[1];
} else if (eventData.includes("ie_ue")) {
jsonData += eventData.split('ie_ue')[0];
console.log(jsonData);
jsonData = '';
} else if (jsonData.length > 0) {
jsonData += eventData;
}
};
return socket;
};
/**
* Sends startup commands to the terminal
*/
export const sendTerminalStartupCommands = (socket: WebSocket, initCommands: string): void => {
let keepAliveID: NodeJS.Timeout = null;
let pingCount = 0;
if (socket && socket.readyState === WebSocket.OPEN) {
socket.send(initCommands);
} else {
socket.onopen = () => {
socket.send(initCommands);
const keepSocketAlive = (socket: WebSocket) => {
if (socket.readyState === WebSocket.OPEN) {
if (pingCount >= MAX_PING_COUNT) {
socket.close();
} else {
socket.send('');
pingCount++;
keepAliveID = setTimeout(() => keepSocketAlive(socket), 1000);
}
}
};
keepSocketAlive(socket);
};
}
};
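To make the socket URI rewrite in provisionCloudShellSession easier to follow, here is a worked example with purely hypothetical values (real console hosts and terminal ids will differ):

// provisionConsoleResponse.properties.uri = "https://ccon-prod.servicebus.windows.net/consolehost"
// connectTerminalResponse.socketUri       = "wss://ccon-prod.servicebus.windows.net:443/some/other/path"
// connectTerminalResponse.id (termId)     = "abc123"
//
// targetUriBody = "ccon-prod.servicebus.windows.net/consolehost/terminals"
// socketUri does not contain targetUriBody, so it is first rewritten to
//   wss://ccon-prod.servicebus.windows.net/consolehost/terminals/abc123
// and because the host contains "servicebus", the relay ($hc) form wins:
//   wss://ccon-prod.servicebus.windows.net/$hc/consolehost/terminals/abc123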

View File

@@ -0,0 +1,320 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
*/
import { ApiVersionsConfig, ResourceType } from "Explorer/Tabs/CloudShellTab/DataModels";
import { v4 as uuidv4 } from 'uuid';
import { configContext } from "../../../ConfigContext";
import { TerminalKind } from "../../../Contracts/ViewModels";
import { userContext } from '../../../UserContext';
import { armRequest } from "../../../Utils/arm/request";
import { Authorization, ConnectTerminalResponse, NetworkType, OsType, ProvisionConsoleResponse, SessionType, Settings, ShellType } from "./DataModels";
/**
* API version configuration by terminal type and resource type
*/
const API_VERSIONS : ApiVersionsConfig = {
// Default version for fallback
DEFAULT: "2024-07-01",
// Resource type specific defaults
RESOURCE_DEFAULTS: {
[ResourceType.NETWORK]: "2023-05-01",
[ResourceType.DATABASE]: "2024-07-01",
[ResourceType.VNET]: "2023-05-01",
[ResourceType.SUBNET]: "2023-05-01",
[ResourceType.RELAY]: "2024-01-01",
[ResourceType.ROLE]: "2022-04-01"
},
// Shell-type specific versions with resource overrides
SHELL_TYPES: {
[TerminalKind.Mongo]: {
[ResourceType.DATABASE]: "2024-11-15"
},
[TerminalKind.VCoreMongo]: {
[ResourceType.DATABASE]: "2024-07-01"
},
[TerminalKind.Cassandra]: {
[ResourceType.DATABASE]: "2024-11-15"
}
}
};
export const validateUserSettings = (userSettings: Settings) => {
// CloudShell requires an ephemeral session on a Linux host
return (
userSettings.properties.sessionType === SessionType.Ephemeral &&
userSettings.properties.preferredOsType === OsType.Linux
);
}
// Current shell type context
let currentShellType: TerminalKind | null = null;
/**
* Set the active shell type to determine API version
*/
export const setShellType = (shellType: TerminalKind): void => {
currentShellType = shellType;
};
/**
* Get the appropriate API version based on shell type and resource type
* Uses a cascading fallback approach for maximum flexibility
*/
export const getApiVersion = (resourceType?: ResourceType): string => {
// If no shell type is set, fallback to resource default or global default
if (!currentShellType) {
return resourceType ?
(API_VERSIONS.RESOURCE_DEFAULTS[resourceType] || API_VERSIONS.DEFAULT) :
API_VERSIONS.DEFAULT;
}
// Shell type is set, try to get specific version in this priority:
// 1. Shell-specific + resource-specific
if (resourceType &&
API_VERSIONS.SHELL_TYPES[currentShellType]) {
const shellTypeConfig = API_VERSIONS.SHELL_TYPES[currentShellType];
if (resourceType in shellTypeConfig) {
return shellTypeConfig[resourceType] as string;
}
}
// 2. Resource-specific default
if (resourceType && resourceType in API_VERSIONS.RESOURCE_DEFAULTS) {
return API_VERSIONS.RESOURCE_DEFAULTS[resourceType];
}
// 3. Global default
return API_VERSIONS.DEFAULT;
};
export const getUserRegion = async (subscriptionId: string, resourceGroup: string, accountName: string) => {
return await armRequest({
host: configContext.ARM_ENDPOINT,
path: `/subscriptions/${subscriptionId}/resourceGroups/${resourceGroup}/providers/Microsoft.DocumentDB/databaseAccounts/${accountName}`,
method: "GET",
apiVersion: "2022-12-01"
});
};
export const deleteUserSettings = async (): Promise<void> => {
await armRequest<void>({
host: configContext.ARM_ENDPOINT,
path: `/providers/Microsoft.Portal/userSettings/cloudconsole`,
method: "DELETE",
apiVersion: "2023-02-01-preview"
});
};
export const getUserSettings = async (): Promise<Settings> => {
const resp = await armRequest<any>({
host: configContext.ARM_ENDPOINT,
path: `/providers/Microsoft.Portal/userSettings/cloudconsole`,
method: "GET",
apiVersion: "2023-02-01-preview"
});
return resp;
};
export const putEphemeralUserSettings = async (userSubscriptionId: string, userRegion: string, vNetSettings?: object) => {
const ephemeralSettings = {
properties: {
preferredOsType: OsType.Linux,
preferredShellType: ShellType.Bash,
preferredLocation: userRegion,
networkType: (!vNetSettings || Object.keys(vNetSettings).length === 0) ? NetworkType.Default : NetworkType.Isolated,
sessionType: SessionType.Ephemeral,
userSubscription: userSubscriptionId,
vnetSettings: vNetSettings ?? {}
}
};
return await armRequest({
host: configContext.ARM_ENDPOINT,
path: `/providers/Microsoft.Portal/userSettings/cloudconsole`,
method: "PUT",
apiVersion: "2023-02-01-preview",
body: ephemeralSettings
});
};
export const verifyCloudShellProviderRegistration = async(subscriptionId: string) => {
return await armRequest({
host: configContext.ARM_ENDPOINT,
path: `/subscriptions/${subscriptionId}/providers/Microsoft.CloudShell`,
method: "GET",
apiVersion: "2022-12-01"
});
};
export const registerCloudShellProvider = async (subscriptionId: string) => {
return await armRequest({
host: configContext.ARM_ENDPOINT,
path: `/subscriptions/${subscriptionId}/providers/Microsoft.CloudShell/register`,
method: "POST",
apiVersion: "2022-12-01"
});
};
export const provisionConsole = async (subscriptionId: string, location: string): Promise<ProvisionConsoleResponse> => {
const data = {
properties: {
osType: OsType.Linux
}
};
return await armRequest<any>({
host: configContext.ARM_ENDPOINT,
path: `providers/Microsoft.Portal/consoles/default`,
method: "PUT",
apiVersion: "2023-02-01-preview",
customHeaders: {
'x-ms-console-preferred-location': location
},
body: data,
});
};
export const connectTerminal = async (consoleUri: string, size: { rows: number, cols: number }): Promise<ConnectTerminalResponse> => {
const targetUri = consoleUri + `/terminals?cols=${size.cols}&rows=${size.rows}&version=2019-01-01&shell=bash`;
const resp = await fetch(targetUri, {
method: "POST",
headers: {
'Accept': 'application/json',
'Content-Type': 'application/json',
'Content-Length': '2',
'Authorization': userContext.authorizationToken,
'x-ms-client-request-id': uuidv4(),
'Accept-Language': getLocale(),
},
body: "{}" // empty body is necessary
});
return resp.json();
};
export const authorizeSession = async (consoleUri: string): Promise<Authorization> => {
const targetUri = consoleUri + "/authorize";
const resp = await fetch(targetUri, {
method: "POST",
headers: {
'Accept': 'application/json',
'Authorization': userContext.authorizationToken,
'Accept-Language': getLocale(),
"Content-Type": 'application/json'
},
body: "{}" // empty body is necessary
});
return resp.json();
};
export const getLocale = () => {
// Use the browser locale when available (e.g. "en-US"), otherwise fall back to "en-us"
const langLocale = navigator.language;
return (langLocale && langLocale.length > 2 ? langLocale : 'en-us');
};
const validCloudShellRegions = new Set(["westus", "southcentralus", "eastus", "northeurope", "westeurope", "centralindia", "southeastasia", "westcentralus"]);
export const getNormalizedRegion = (region: string, defaultCloudshellRegion: string) => {
if (!region) return defaultCloudshellRegion;
const regionMap: Record<string, string> = {
"centralus": "westcentralus",
"eastus2": "eastus"
};
const normalizedRegion = regionMap[region.toLowerCase()] || region;
return validCloudShellRegions.has(normalizedRegion.toLowerCase()) ? normalizedRegion : defaultCloudshellRegion;
};
export async function getNetworkProfileInfo<T>(networkProfileResourceId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.NETWORK);
return await GetARMCall<T>(networkProfileResourceId, apiVersion);
}
export async function getAccountDetails<T>(databaseAccountId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.DATABASE);
return await GetARMCall<T>(databaseAccountId, apiVersion);
}
export async function getVnetInformation<T>(vnetId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.VNET);
return await GetARMCall<T>(vnetId, apiVersion);
}
export async function getSubnetInformation<T>(subnetId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.SUBNET);
return await GetARMCall<T>(subnetId, apiVersion);
}
export async function updateSubnetInformation<T>(subnetId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.SUBNET);
return await PutARMCall(subnetId, request, apiVersion);
}
export async function updateDatabaseAccount<T>(accountId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.DATABASE);
return await PutARMCall(accountId, request, apiVersion);
}
export async function getDatabaseOperations<T>(accountId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.DATABASE);
return await GetARMCall<T>(`${accountId}/operations`, apiVersion);
}
export async function updateVnet<T>(vnetId: string, request: object, apiVersionOverride?: string) {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.VNET);
return await PutARMCall<T>(vnetId, request, apiVersion);
}
export async function getVnet<T>(vnetId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.VNET);
return await GetARMCall<T>(vnetId, apiVersion);
}
export async function createNetworkProfile<T>(networkProfileId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.NETWORK);
return await PutARMCall<T>(networkProfileId, request, apiVersion);
}
export async function createRelay<T>(relayId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.RELAY);
return await PutARMCall<T>(relayId, request, apiVersion);
}
export async function getRelay<T>(relayId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.RELAY);
return await GetARMCall<T>(relayId, apiVersion);
}
export async function createRoleOnNetworkProfile<T>(roleid: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.ROLE);
return await PutARMCall<T>(roleid, request, apiVersion);
}
export async function createRoleOnRelay<T>(roleid: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.ROLE);
return await PutARMCall<T>(roleid, request, apiVersion);
}
export async function GetARMCall<T>(path: string, apiVersion: string = API_VERSIONS.DEFAULT): Promise<T> {
return await armRequest<T>({
host: configContext.ARM_ENDPOINT,
path: path,
method: "GET",
apiVersion: apiVersion
});
}
export async function PutARMCall<T>(path: string, request: object, apiVersion: string = API_VERSIONS.DEFAULT): Promise<T> {
return await armRequest<T>({
host: configContext.ARM_ENDPOINT,
path: path,
method: "PUT",
apiVersion: apiVersion,
body: request
});
}
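A small sketch of the cascading version lookup above, using values from the API_VERSIONS table in this file; the module path in the import is assumed, since file names are not shown in this diff:

import { TerminalKind } from "../../../Contracts/ViewModels";
import { ResourceType } from "./DataModels";
import { getApiVersion, setShellType } from "./CloudShellApiClient"; // path assumed

getApiVersion(ResourceType.RELAY);    // "2024-01-01" (resource default; no shell type set yet)
setShellType(TerminalKind.Mongo);
getApiVersion(ResourceType.DATABASE); // "2024-11-15" (Mongo-specific override)
getApiVersion(ResourceType.VNET);     // "2023-05-01" (no Mongo override, so the resource default applies)
getApiVersion();                      // "2024-07-01" (global default)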

View File

@@ -0,0 +1,263 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* CloudShell API client for various operations
*/
import { v4 as uuidv4 } from 'uuid';
import { configContext } from "../../../../ConfigContext";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { userContext } from '../../../../UserContext';
import { armRequest } from "../../../../Utils/arm/request";
import { ApiVersionsConfig, DEFAULT_API_VERSIONS } from "../Models/ApiVersions";
import { Authorization, ConnectTerminalResponse, NetworkType, OsType, ProvisionConsoleResponse, ResourceType, SessionType, Settings, ShellType } from "../Models/DataModels";
import { getLocale } from '../Data/LocalizationUtils';
// Current shell type context
let currentShellType: TerminalKind | null = null;
/**
* Set the active shell type to determine API version
*/
export const setShellType = (shellType: TerminalKind): void => {
currentShellType = shellType;
};
/**
* Get the appropriate API version based on shell type and resource type
*/
export const getApiVersion = (resourceType?: ResourceType, apiVersions?: ApiVersionsConfig): string => {
if (!apiVersions) {
apiVersions = DEFAULT_API_VERSIONS; // Default fallback
}
// Shell type is set, try to get specific version in this priority:
// 1. Shell-specific + resource-specific
if (resourceType &&
apiVersions.SHELL_TYPES[currentShellType]) {
const shellTypeConfig = apiVersions.SHELL_TYPES[currentShellType];
if (resourceType in shellTypeConfig) {
return shellTypeConfig[resourceType] as string;
}
}
// 2. Resource-specific default
if (resourceType && resourceType in apiVersions.RESOURCE_DEFAULTS) {
return apiVersions.RESOURCE_DEFAULTS[resourceType];
}
// 3. Global default
return apiVersions.DEFAULT;
};
export const getUserRegion = async (subscriptionId: string, resourceGroup: string, accountName: string) => {
return await armRequest({
host: configContext.ARM_ENDPOINT,
path: `/subscriptions/${subscriptionId}/resourceGroups/${resourceGroup}/providers/Microsoft.DocumentDB/databaseAccounts/${accountName}`,
method: "GET",
apiVersion: "2022-12-01"
});
};
export const deleteUserSettings = async (): Promise<void> => {
await armRequest<void>({
host: configContext.ARM_ENDPOINT,
path: `/providers/Microsoft.Portal/userSettings/cloudconsole`,
method: "DELETE",
apiVersion: "2023-02-01-preview"
});
};
export const getUserSettings = async (): Promise<Settings> => {
const resp = await armRequest<any>({
host: configContext.ARM_ENDPOINT,
path: `/providers/Microsoft.Portal/userSettings/cloudconsole`,
method: "GET",
apiVersion: "2023-02-01-preview"
});
return resp;
};
export const putEphemeralUserSettings = async (userSubscriptionId: string, userRegion: string, vNetSettings?: object) => {
const ephemeralSettings = {
properties: {
preferredOsType: OsType.Linux,
preferredShellType: ShellType.Bash,
preferredLocation: userRegion,
networkType: (!vNetSettings || Object.keys(vNetSettings).length === 0) ? NetworkType.Default : NetworkType.Isolated,
sessionType: SessionType.Ephemeral,
userSubscription: userSubscriptionId,
vnetSettings: vNetSettings ?? {}
}
};
return await armRequest({
host: configContext.ARM_ENDPOINT,
path: `/providers/Microsoft.Portal/userSettings/cloudconsole`,
method: "PUT",
apiVersion: "2023-02-01-preview",
body: ephemeralSettings
});
};
export const verifyCloudShellProviderRegistration = async(subscriptionId: string) => {
return await armRequest({
host: configContext.ARM_ENDPOINT,
path: `/subscriptions/${subscriptionId}/providers/Microsoft.CloudShell`,
method: "GET",
apiVersion: "2022-12-01"
});
};
export const registerCloudShellProvider = async (subscriptionId: string) => {
return await armRequest({
host: configContext.ARM_ENDPOINT,
path: `/subscriptions/${subscriptionId}/providers/Microsoft.CloudShell/register`,
method: "POST",
apiVersion: "2022-12-01"
});
};
export const provisionConsole = async (subscriptionId: string, location: string): Promise<ProvisionConsoleResponse> => {
const data = {
properties: {
osType: OsType.Linux
}
};
return await armRequest<any>({
host: configContext.ARM_ENDPOINT,
path: `providers/Microsoft.Portal/consoles/default`,
method: "PUT",
apiVersion: "2023-02-01-preview",
customHeaders: {
'x-ms-console-preferred-location': location
},
body: data,
});
};
export const connectTerminal = async (consoleUri: string, size: { rows: number, cols: number }): Promise<ConnectTerminalResponse> => {
const targetUri = consoleUri + `/terminals?cols=${size.cols}&rows=${size.rows}&version=2019-01-01&shell=bash`;
const resp = await fetch(targetUri, {
method: "POST",
headers: {
'Accept': 'application/json',
'Content-Type': 'application/json',
'Content-Length': '2',
'Authorization': userContext.authorizationToken,
'x-ms-client-request-id': uuidv4(),
'Accept-Language': getLocale(),
},
body: "{}" // empty body is necessary
});
return resp.json();
};
export const authorizeSession = async (consoleUri: string): Promise<Authorization> => {
const targetUri = consoleUri + "/authorize";
const resp = await fetch(targetUri, {
method: "POST",
headers: {
'Accept': 'application/json',
'Authorization': userContext.authorizationToken,
'Accept-Language': getLocale(),
"Content-Type": 'application/json'
},
body: "{}" // empty body is necessary
});
return resp.json();
};
export async function getNetworkProfileInfo<T>(networkProfileResourceId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.NETWORK);
return await GetARMCall<T>(networkProfileResourceId, apiVersion);
}
export async function getAccountDetails<T>(databaseAccountId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.DATABASE);
return await GetARMCall<T>(databaseAccountId, apiVersion);
}
export async function getVnetInformation<T>(vnetId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.VNET);
return await GetARMCall<T>(vnetId, apiVersion);
}
export async function getSubnetInformation<T>(subnetId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.SUBNET);
return await GetARMCall<T>(subnetId, apiVersion);
}
export async function updateSubnetInformation<T>(subnetId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.SUBNET);
return await PutARMCall<T>(subnetId, request, apiVersion);
}
export async function updateDatabaseAccount<T>(accountId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.DATABASE);
return await PutARMCall<T>(accountId, request, apiVersion);
}
export async function getDatabaseOperations<T>(accountId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.DATABASE);
return await GetARMCall<T>(`${accountId}/operations`, apiVersion);
}
export async function updateVnet<T>(vnetId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.VNET);
return await PutARMCall<T>(vnetId, request, apiVersion);
}
export async function getVnet<T>(vnetId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.VNET);
return await GetARMCall<T>(vnetId, apiVersion);
}
export async function createNetworkProfile<T>(networkProfileId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.NETWORK);
return await PutARMCall<T>(networkProfileId, request, apiVersion);
}
export async function createRelay<T>(relayId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.RELAY);
return await PutARMCall<T>(relayId, request, apiVersion);
}
export async function getRelay<T>(relayId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.RELAY);
return await GetARMCall<T>(relayId, apiVersion);
}
export async function createRoleOnNetworkProfile<T>(roleId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.ROLE);
return await PutARMCall<T>(roleId, request, apiVersion);
}
export async function createRoleOnRelay<T>(roleId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.ROLE);
return await PutARMCall<T>(roleId, request, apiVersion);
}
export async function createPrivateEndpoint<T>(privateEndpointId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.NETWORK);
return await PutARMCall<T>(privateEndpointId, request, apiVersion);
}
export async function GetARMCall<T>(path: string, apiVersion: string = DEFAULT_API_VERSIONS.DEFAULT): Promise<T> {
return await armRequest<T>({
host: configContext.ARM_ENDPOINT,
path: path,
method: "GET",
apiVersion: apiVersion
});
}
export async function PutARMCall<T>(path: string, request: object, apiVersion: string = DEFAULT_API_VERSIONS.DEFAULT): Promise<T> {
return await armRequest<T>({
host: configContext.ARM_ENDPOINT,
path: path,
method: "PUT",
apiVersion: apiVersion,
body: request
});
}

View File

@@ -0,0 +1,12 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Localization utilities for CloudShell
*/
/**
* Gets the current locale for API requests
*/
export const getLocale = (): string => {
const langLocale = navigator.language;
return (langLocale && langLocale.length > 2 ? langLocale : 'en-us');
};

View File

@@ -0,0 +1,37 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Region utilities for CloudShell
*/
export const getLocale = () => {
// Use the browser locale when available (e.g. "en-US"), otherwise fall back to "en-us"
const langLocale = navigator.language;
return (langLocale && langLocale.length > 2 ? langLocale : 'en-us');
};
const validCloudShellRegions = new Set([
"westus",
"southcentralus",
"eastus",
"northeurope",
"westeurope",
"centralindia",
"southeastasia",
"westcentralus"
]);
/**
* Normalizes a region name to a valid CloudShell region
* @param region The region to normalize
* @param defaultCloudshellRegion Default region to use if the provided region is not supported
*/
export const getNormalizedRegion = (region: string, defaultCloudshellRegion: string) => {
if (!region) return defaultCloudshellRegion;
const regionMap: Record<string, string> = {
"centralus": "westcentralus",
"eastus2": "eastus"
};
const normalizedRegion = regionMap[region.toLowerCase()] || region;
return validCloudShellRegions.has(normalizedRegion.toLowerCase()) ? normalizedRegion : defaultCloudshellRegion;
};
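Illustrative behaviour of getNormalizedRegion, with hypothetical account regions on the left and "westus" as the default:

getNormalizedRegion("centralus", "westus");   // "westcentralus" (remapped to a supported region)
getNormalizedRegion("eastus2", "westus");     // "eastus" (remapped)
getNormalizedRegion("westeurope", "westus");  // "westeurope" (already supported, returned unchanged)
getNormalizedRegion("brazilsouth", "westus"); // "westus" (unsupported, falls back to the default)
getNormalizedRegion("", "westus");            // "westus" (no region on the account)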

View File

@@ -0,0 +1,185 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
*/
import { TerminalKind } from "../../../Contracts/ViewModels";
export const enum OsType {
Linux = "linux",
Windows = "windows"
}
export const enum ShellType {
Bash = "bash",
PowerShellCore = "pwsh"
}
export const enum NetworkType {
Default = "Default",
Isolated = "Isolated"
}
export const enum SessionType {
Mounted = "Mounted",
Ephemeral = "Ephemeral"
}
export const enum UserInputs {
NoReset = "1",
ConfigureVNet = "2",
ResetVNet = "3"
};
export type Settings = {
properties: UserSettingProperties
};
export type UserSettingProperties = {
networkType: string;
preferredLocation: string;
preferredOsType: OsType;
preferredShellType: ShellType;
userSubscription: string;
sessionType: SessionType;
vnetSettings: VnetSettings;
}
export type VnetSettings = {
networkProfileResourceId?: string;
relayNamespaceResourceId?: string;
location?: string;
}
export type ProvisionConsoleResponse = {
properties: {
osType: OsType;
provisioningState: string;
uri: string;
};
};
export type Authorization = {
token: string;
};
export type ConnectTerminalResponse = {
id: string;
idleTimeout: string;
rootDirectory: string;
socketUri: string;
tokenUpdated: boolean;
};
export type VnetModel = {
name: string;
id: string;
etag: string;
type: string;
location: string;
tags: Record<string, string>;
properties: {
provisioningState: string;
resourceGuid: string;
addressSpace: {
addressPrefixes: string[];
};
encryption: {
enabled: boolean;
enforcement: string;
};
privateEndpointVNetPolicies: string;
subnets: Array<{
name: string;
id: string;
etag: string;
type: string;
properties: {
provisioningState: string;
addressPrefixes?: string[];
addressPrefix?: string;
networkSecurityGroup?: { id: string };
ipConfigurations?: { id: string }[];
ipConfigurationProfiles?: { id: string }[];
privateEndpoints?: { id: string }[];
serviceEndpoints?: Array<{
provisioningState: string;
service: string;
locations: string[];
}>;
delegations?: Array<{
name: string;
id: string;
etag: string;
type: string;
properties: {
provisioningState: string;
serviceName: string;
actions: string[];
};
}>;
purpose?: string;
privateEndpointNetworkPolicies?: string;
privateLinkServiceNetworkPolicies?: string;
};
}>;
virtualNetworkPeerings: any[];
enableDdosProtection: boolean;
};
};
export type RelayNamespace = {
id: string;
name: string;
type: string;
location: string;
tags: Record<string, string>;
properties: {
metricId: string;
serviceBusEndpoint: string;
provisioningState: string;
status: string;
createdAt: string;
updatedAt: string;
};
sku: {
name: string;
tier: string;
};
};
export type RelayNamespaceResponse = {
value: RelayNamespace[];
};
/**
* Resource types for API versioning
*/
export enum ResourceType {
NETWORK = "NETWORK",
DATABASE = "DATABASE",
VNET = "VNET",
SUBNET = "SUBNET",
RELAY = "RELAY",
ROLE = "ROLE"
}
// Type definition for API_VERSIONS configuration
export type ApiVersionsConfig = {
// Global default API version
DEFAULT: string;
// Resource-specific default API versions
RESOURCE_DEFAULTS: {
[key in ResourceType]: string;
};
// Shell-type specific configurations
SHELL_TYPES: {
[key in TerminalKind]?: {
// Resource-specific overrides for this shell type
[key in ResourceType]?: string;
};
};
};

View File

@@ -0,0 +1,29 @@
/**
* Standardized terminal logging functions for consistent formatting
*/
export const terminalLog = {
// Section headers
header: (message: string) => `\n\x1B[1;34m┌─ ${message} ${"─".repeat(Math.max(45 - message.length, 0))}\x1B[0m`,
subheader: (message: string) => `\x1B[1;36m├ ${message}\x1B[0m`,
sectionEnd: () => `\x1B[1;34m└${"─".repeat(50)}\x1B[0m\n`,
// Status messages
success: (message: string) => `\x1B[32m✓ ${message}\x1B[0m`,
warning: (message: string) => `\x1B[33m⚠ ${message}\x1B[0m`,
error: (message: string) => `\x1B[31m✗ ${message}\x1B[0m`,
info: (message: string) => `\x1B[34m${message}\x1B[0m`,
// Resource information
database: (message: string) => `\x1B[35m🔶 Database: ${message}\x1B[0m`,
vnet: (message: string) => `\x1B[36m🔷 Network: ${message}\x1B[0m`,
cloudshell: (message: string) => `\x1B[32m🔷 CloudShell: ${message}\x1B[0m`,
// Data formatting
item: (label: string, value: string) => `${label}: \x1B[32m${value}\x1B[0m`,
progress: (operation: string, status: string) => `\x1B[34m${operation}: \x1B[36m${status}\x1B[0m`,
// User interaction
prompt: (message: string) => `\x1B[1;37m${message}\x1B[0m`,
separator: () => `\x1B[30;1m${"─".repeat(50)}\x1B[0m`
};
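Two concrete examples of the helpers above (escape sequences shown literally):

terminalLog.success("Connection established");
// -> "\x1B[32m✓ Connection established\x1B[0m" (green check-mark prefix)
terminalLog.item("Database Region", "westus");
// -> "Database Region: \x1B[32mwestus\x1B[0m" (label plain, value in green)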

View File

@@ -0,0 +1,74 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* API versions configuration for CloudShell
*/
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { ResourceType } from "./DataModels";
/**
* Configuration for API versions used by the CloudShell
*/
export type ApiVersionsConfig = {
DEFAULT: string;
RESOURCE_DEFAULTS: Record<ResourceType, string>;
SHELL_TYPES: Partial<Record<TerminalKind, Partial<Record<ResourceType, string>>>>;
}
/**
* Default API versions configuration
*/
export const DEFAULT_API_VERSIONS: ApiVersionsConfig = {
DEFAULT: '2024-07-01',
RESOURCE_DEFAULTS: {
[ResourceType.DATABASE]: '2024-11-15',
[ResourceType.NETWORK]: '2024-07-01',
[ResourceType.VNET]: '2024-07-01',
[ResourceType.SUBNET]: '2024-07-01',
[ResourceType.RELAY]: '2022-10-01',
[ResourceType.ROLE]: '2022-04-01',
},
SHELL_TYPES: {
[TerminalKind.Mongo]: {
[ResourceType.DATABASE]: '2024-11-15',
[ResourceType.NETWORK]: '2024-07-01',
[ResourceType.VNET]: '2024-07-01',
[ResourceType.SUBNET]: '2024-07-01',
[ResourceType.RELAY]: '2024-01-01',
[ResourceType.ROLE]: '2022-04-01',
},
[TerminalKind.VCoreMongo]: {
[ResourceType.DATABASE]: '2024-07-01',
[ResourceType.NETWORK]: '2024-07-01',
[ResourceType.VNET]: '2024-07-01',
[ResourceType.SUBNET]: '2024-07-01',
[ResourceType.RELAY]: '2024-01-01',
[ResourceType.ROLE]: '2022-04-01',
},
[TerminalKind.Postgres]: {
[ResourceType.DATABASE]: '2024-11-15',
[ResourceType.NETWORK]: '2024-07-01',
[ResourceType.VNET]: '2024-07-01',
[ResourceType.SUBNET]: '2024-07-01',
[ResourceType.RELAY]: '2024-01-01',
[ResourceType.ROLE]: '2022-04-01',
},
[TerminalKind.Cassandra]: {
[ResourceType.DATABASE]: '2024-11-15',
[ResourceType.NETWORK]: '2024-07-01',
[ResourceType.VNET]: '2024-07-01',
[ResourceType.SUBNET]: '2024-07-01',
[ResourceType.RELAY]: '2024-01-01',
[ResourceType.ROLE]: '2022-04-01',
},
[TerminalKind.Default]: {
[ResourceType.DATABASE]: undefined,
[ResourceType.NETWORK]: undefined,
[ResourceType.VNET]: undefined,
[ResourceType.SUBNET]: undefined,
[ResourceType.RELAY]: undefined,
[ResourceType.ROLE]: undefined,
},
},
};

View File

@@ -0,0 +1,163 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Data models for CloudShell
*/
export const enum OsType {
Linux = "linux",
Windows = "windows"
}
export const enum ShellType {
Bash = "bash",
PowerShellCore = "pwsh"
}
export const enum NetworkType {
Default = "Default",
Isolated = "Isolated"
}
export const enum SessionType {
Mounted = "Mounted",
Ephemeral = "Ephemeral"
}
export const enum UserInputs {
NoReset = "1",
ConfigureVNet = "2",
ResetVNet = "3"
};
export type Settings = {
properties: UserSettingProperties
};
export type UserSettingProperties = {
networkType: string;
preferredLocation: string;
preferredOsType: OsType;
preferredShellType: ShellType;
userSubscription: string;
sessionType: SessionType;
vnetSettings: VnetSettings;
}
export type VnetSettings = {
networkProfileResourceId?: string;
relayNamespaceResourceId?: string;
location?: string;
}
export type ProvisionConsoleResponse = {
properties: {
osType: OsType;
provisioningState: string;
uri: string;
};
};
export type Authorization = {
token: string;
};
export type ConnectTerminalResponse = {
id: string;
idleTimeout: string;
rootDirectory: string;
socketUri: string;
tokenUpdated: boolean;
};
export type VnetModel = {
name: string;
id: string;
etag: string;
type: string;
location: string;
tags: Record<string, string>;
properties: {
provisioningState: string;
resourceGuid: string;
addressSpace: {
addressPrefixes: string[];
};
encryption: {
enabled: boolean;
enforcement: string;
};
privateEndpointVNetPolicies: string;
subnets: Array<{
name: string;
id: string;
etag: string;
type: string;
properties: {
provisioningState: string;
addressPrefixes?: string[];
addressPrefix?: string;
networkSecurityGroup?: { id: string };
ipConfigurations?: { id: string }[];
ipConfigurationProfiles?: { id: string }[];
privateEndpoints?: { id: string }[];
serviceEndpoints?: Array<{
provisioningState: string;
service: string;
locations: string[];
}>;
delegations?: Array<{
name: string;
id: string;
etag: string;
type: string;
properties: {
provisioningState: string;
serviceName: string;
actions: string[];
};
}>;
purpose?: string;
privateEndpointNetworkPolicies?: string;
privateLinkServiceNetworkPolicies?: string;
};
}>;
virtualNetworkPeerings: any[];
enableDdosProtection: boolean;
};
};
export type RelayNamespace = {
id: string;
name: string;
type: string;
location: string;
tags: Record<string, string>;
properties: {
metricId: string;
serviceBusEndpoint: string;
provisioningState: string;
status: string;
createdAt: string;
updatedAt: string;
};
sku: {
name: string;
tier: string;
};
};
export type RelayNamespaceResponse = {
value: RelayNamespace[];
};
/**
* Resource types for API versioning
*/
export enum ResourceType {
NETWORK = "NETWORK",
DATABASE = "DATABASE",
VNET = "VNET",
SUBNET = "SUBNET",
RELAY = "RELAY",
ROLE = "ROLE"
}

View File

@@ -0,0 +1,94 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Firewall handling functionality for CloudShell
*/
import { Terminal } from "xterm";
import { userContext } from "../../../../UserContext";
import { hasFirewallRestrictions } from "../../Shared/CheckFirewallRules";
import { getAccountDetails, updateDatabaseAccount } from "../Data/CloudShellApiClient";
import { askConfirmation } from "../Utils/CommonUtils";
import { terminalLog } from "../Utils/LogFormatter";
export class FirewallHandler {
/**
* Checks if firewall configuration is needed for CloudShell
*/
public static async checkFirewallConfiguration(terminal: Terminal): Promise<boolean> {
if (!hasFirewallRestrictions()) {
return false; // No firewall rules to configure
}
terminal.writeln(terminalLog.header("Database Firewall Configuration"));
terminal.writeln(terminalLog.warning("Your database has firewall restrictions enabled"));
terminal.writeln(terminalLog.warning("CloudShell might need access through these restrictions"));
const shouldConfigureFirewall = await askConfirmation(
terminal,
"Would you like to check and configure firewall settings?"
);
if (!shouldConfigureFirewall) {
terminal.writeln(terminalLog.info("Skipping firewall configuration"));
return false;
}
return await this.configureFirewallForCloudShell(terminal);
}
/**
* Configures firewall for CloudShell access
*/
private static async configureFirewallForCloudShell(terminal: Terminal): Promise<boolean> {
try {
// Get current database account details
terminal.writeln(terminalLog.database("Retrieving current firewall configuration..."));
const dbAccount = userContext.databaseAccount;
const currentDbAccount = await getAccountDetails<any>(dbAccount.id);
// Check whether public network access (needed for "Allow Azure Services") is already enabled
const azureServicesEnabled = currentDbAccount.properties.publicNetworkAccess === "Enabled";
if (azureServicesEnabled) {
terminal.writeln(terminalLog.success("Azure services access is already enabled"));
return true;
}
// Ask user to enable Azure services access
terminal.writeln(terminalLog.warning("Azure services access is not enabled"));
terminal.writeln(terminalLog.info("CloudShell requires 'Allow Azure Services' to be enabled"));
const enableAzureServices = await askConfirmation(
terminal,
"Enable 'Allow Azure Services' for this database?"
);
if (!enableAzureServices) {
terminal.writeln(terminalLog.warning("CloudShell may not be able to connect without enabling Azure services access"));
return false;
}
// Update database account to enable Azure services access
terminal.writeln(terminalLog.info("Updating database firewall configuration..."));
// Create update payload - only modify firewall-related properties
const updatePayload = {
...currentDbAccount,
properties: {
...currentDbAccount.properties,
publicNetworkAccess: "Enabled"
}
};
await updateDatabaseAccount(dbAccount.id, updatePayload);
terminal.writeln(terminalLog.success("Database firewall updated successfully"));
terminal.writeln(terminalLog.success("Azure services access is now enabled"));
return true;
} catch (error) {
terminal.writeln(terminalLog.error(`Error configuring firewall: ${error.message}`));
return false;
}
}
}
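A hypothetical call site, assuming the firewall check runs from async provisioning code before the console is requested and that `terminal` and `terminalLog` are in scope (this wiring is not part of the diff):

const firewallReady = await FirewallHandler.checkFirewallConfiguration(terminal);
if (!firewallReady) {
  terminal.writeln(terminalLog.warning("Continuing without firewall changes; connections may be blocked"));
}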

View File

@@ -0,0 +1,99 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Network access configuration handler for CloudShell
*/
import { Terminal } from "xterm";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { IsPublicAccessAvailable } from "../../Shared/CheckFirewallRules";
import { getUserSettings } from "../Data/CloudShellApiClient";
import { VnetSettings } from "../Models/DataModels";
import { terminalLog } from "../Utils/LogFormatter";
import { VNetHandler } from "./VNetHandler";
export class NetworkAccessHandler {
/**
* Configures network access for the CloudShell based on shell type and network restrictions
*/
public static async configureNetworkAccess(
terminal: Terminal,
region: string,
shellType: TerminalKind
): Promise<{
vNetSettings: any;
isAllPublicAccessEnabled: boolean;
}> {
// Check if public access is available for this shell type
const isAllPublicAccessEnabled = await IsPublicAccessAvailable(shellType);
// If public access is enabled, no need for VNet configuration
if (isAllPublicAccessEnabled) {
terminal.writeln(terminalLog.database("Public access enabled. Skipping VNet configuration."));
return {
vNetSettings: {},
isAllPublicAccessEnabled: true
};
}
// Public access is restricted, we need to configure a VNet or use existing one
terminal.writeln(terminalLog.database("Network restrictions detected"));
terminal.writeln(terminalLog.info("Loading CloudShell configuration..."));
// Get existing settings if available
const settings = await getUserSettings();
if (!settings) {
terminal.writeln(terminalLog.warning("No existing user settings found."));
}
// Retrieve CloudShell VNet settings if available
let cloudShellVnetSettings: VnetSettings | undefined;
if (settings) {
cloudShellVnetSettings = await VNetHandler.retrieveCloudShellVnetSettings(settings, terminal);
}
// If CloudShell has VNet settings, check with database config
let finalVNetSettings = {};
if (cloudShellVnetSettings && cloudShellVnetSettings.networkProfileResourceId) {
// Check if we should use existing VNet settings
const isContinueWithSameVnet = await VNetHandler.askForVNetConfigConsent(terminal, shellType);
if (isContinueWithSameVnet) {
// Check if the VNet is already configured in the database
const isVNetInDatabaseConfig = await VNetHandler.isCloudShellVNetInDatabaseConfig(cloudShellVnetSettings, terminal);
if (!isVNetInDatabaseConfig) {
terminal.writeln(terminalLog.warning("CloudShell VNet is not configured in database access list"));
const addToDatabase = await VNetHandler.askToAddVNetToDatabase(terminal, cloudShellVnetSettings);
if (addToDatabase) {
await VNetHandler.addCloudShellVNetToDatabase(cloudShellVnetSettings, terminal);
finalVNetSettings = cloudShellVnetSettings;
} else {
// User declined to add VNet to database, need to recreate
terminal.writeln(terminalLog.warning("Will configure new VNet..."));
cloudShellVnetSettings = undefined;
}
} else {
terminal.writeln(terminalLog.success("CloudShell VNet is already in database configuration"));
finalVNetSettings = cloudShellVnetSettings;
}
} else {
cloudShellVnetSettings = undefined; // User declined to use existing VNet settings
}
}
// If we don't have valid VNet settings, create new ones
if (!cloudShellVnetSettings || !cloudShellVnetSettings.networkProfileResourceId) {
terminal.writeln(terminalLog.subheader("Configuring network infrastructure"));
finalVNetSettings = await VNetHandler.configureCloudShellVNet(terminal, region);
// Add the new VNet to the database
await VNetHandler.addCloudShellVNetToDatabase(finalVNetSettings as VnetSettings, terminal);
}
return {
vNetSettings: finalVNetSettings,
isAllPublicAccessEnabled: false
};
}
}
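A minimal usage sketch (illustration only, not part of this change) of how a shell handler is expected to consume configureNetworkAccess; the Terminal instance, the resolved region, and the resolveNetworkSettings helper name are assumptions for the example:
// Sketch only: assumes an xterm Terminal and a resolved region are already available.
import { Terminal } from "xterm";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { NetworkAccessHandler } from "./NetworkAccessHandler";
export async function resolveNetworkSettings(terminal: Terminal, region: string) {
  // When the account allows public access the handler returns { vNetSettings: {}, isAllPublicAccessEnabled: true };
  // otherwise it returns the VNet settings to pass to CloudShell provisioning.
  const result = await NetworkAccessHandler.configureNetworkAccess(terminal, region, TerminalKind.Mongo);
  return result.isAllPublicAccessEnabled ? undefined : result.vNetSettings;
}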

View File

@@ -0,0 +1,894 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* VNet handling functionality for CloudShell
*/
import { v4 as uuidv4 } from 'uuid';
import { Terminal } from "xterm";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { userContext } from "../../../../UserContext";
import { hasPrivateEndpointsRestrictions } from "../../Shared/CheckFirewallRules";
import {
createNetworkProfile,
createPrivateEndpoint,
createRelay,
createRoleOnNetworkProfile,
createRoleOnRelay,
getAccountDetails,
getDatabaseOperations,
getNetworkProfileInfo,
getRelay,
getSubnetInformation,
getVnet,
getVnetInformation,
updateDatabaseAccount,
updateSubnetInformation,
updateVnet
} from "../Data/CloudShellApiClient";
import { Settings, VnetSettings } from "../Models/DataModels";
import { askConfirmation, askQuestion, wait } from "../Utils/CommonUtils";
import { terminalLog } from "../Utils/LogFormatter";
// Constants for VNet configuration
const POLLING_INTERVAL_MS = 5000;
const MAX_RETRY_COUNT = 10;
const STANDARD_SKU = "Standard";
const DEFAULT_VNET_ADDRESS_PREFIX = "10.0.0.0/16";
const DEFAULT_SUBNET_ADDRESS_PREFIX = "10.0.1.0/24";
const DEFAULT_CONTAINER_INSTANCE_OID = "88536fb9-d60a-4aee-8195-041425d6e927";
export class VNetHandler {
/**
* Retrieves CloudShell VNet settings from user settings
*/
public static async retrieveCloudShellVnetSettings(settings: Settings, terminal: Terminal): Promise<VnetSettings> {
if (settings?.properties?.vnetSettings && Object.keys(settings.properties.vnetSettings).length > 0) {
try {
const netProfileInfo = await getNetworkProfileInfo<any>(settings.properties.vnetSettings.networkProfileResourceId);
terminal.writeln(terminalLog.header("Existing Network Configuration"));
const subnetId = netProfileInfo.properties.containerNetworkInterfaceConfigurations[0]
.properties.ipConfigurations[0].properties.subnet.id;
const vnetResourceId = subnetId.replace(/\/subnets\/[^/]+$/, '');
terminal.writeln(terminalLog.item("VNet", vnetResourceId));
terminal.writeln(terminalLog.item("Subnet", subnetId));
terminal.writeln(terminalLog.item("Location", settings.properties.vnetSettings.location));
terminal.writeln(terminalLog.item("Network Profile", settings.properties.vnetSettings.networkProfileResourceId));
terminal.writeln(terminalLog.item("Relay Namespace", settings.properties.vnetSettings.relayNamespaceResourceId));
return {
networkProfileResourceId: settings.properties.vnetSettings.networkProfileResourceId,
relayNamespaceResourceId: settings.properties.vnetSettings.relayNamespaceResourceId,
location: settings.properties.vnetSettings.location
};
} catch (err) {
terminal.writeln(terminalLog.warning("Error retrieving network profile. Will configure new network."));
return undefined;
}
}
return undefined;
}
/**
* Asks the user if they want to use existing network configuration (VNet or private endpoint)
*/
public static async askForVNetConfigConsent(terminal: Terminal, shellType: TerminalKind = null): Promise<boolean> {
// Check if this shell type supports only private endpoints
const isPrivateEndpointOnlyShell = shellType === TerminalKind.VCoreMongo;
// Check if the database has private endpoints configured
const hasPrivateEndpoints = hasPrivateEndpointsRestrictions();
// Determine which network type to mention based on shell type and database configuration
const networkType = isPrivateEndpointOnlyShell || hasPrivateEndpoints ? "private endpoint" : "network";
// Ask for consent
terminal.writeln("");
terminal.writeln(terminalLog.prompt(`Use this existing ${networkType} configuration?`));
terminal.writeln(terminalLog.info(`Answering 'N' will configure a new ${networkType} for CloudShell`));
return await askConfirmation(terminal, `Press Y/N to continue...`);
}
/**
* Checks if the CloudShell VNet is already in the database configuration
*/
public static async isCloudShellVNetInDatabaseConfig(vNetSettings: VnetSettings, terminal: Terminal): Promise<boolean> {
try {
terminal.writeln(terminalLog.subheader("Verifying if CloudShell VNet is configured in database"));
// Get the subnet ID from the CloudShell Network Profile
const netProfileInfo = await getNetworkProfileInfo<any>(vNetSettings.networkProfileResourceId);
if (!netProfileInfo?.properties?.containerNetworkInterfaceConfigurations?.[0]
?.properties?.ipConfigurations?.[0]?.properties?.subnet?.id) {
terminal.writeln(terminalLog.warning("Could not retrieve subnet ID from CloudShell VNet"));
return false;
}
const cloudShellSubnetId = netProfileInfo.properties.containerNetworkInterfaceConfigurations[0]
.properties.ipConfigurations[0].properties.subnet.id;
terminal.writeln(terminalLog.item("CloudShell Subnet", cloudShellSubnetId.split('/').pop() || ""));
// Check if this subnet ID is in the database VNet rules
const dbAccount = userContext.databaseAccount;
if (!dbAccount?.properties?.virtualNetworkRules) {
return false;
}
const vnetRules = dbAccount.properties.virtualNetworkRules;
// Check if the CloudShell subnet is already in the rules
return vnetRules.some(rule => rule.id === cloudShellSubnetId);
} catch (err) {
terminal.writeln(terminalLog.error("Error checking database VNet configuration"));
return false;
}
}
/**
* Asks the user if they want to add the CloudShell VNet to the database configuration
*/
public static async askToAddVNetToDatabase(terminal: Terminal, vNetSettings: VnetSettings): Promise<boolean> {
terminal.writeln("");
terminal.writeln(terminalLog.header("Network Configuration Mismatch"));
terminal.writeln(terminalLog.warning("Your CloudShell VNet is not in your database's allowed networks"));
terminal.writeln(terminalLog.warning("To connect from CloudShell, this VNet must be added to your database"));
return await askConfirmation(terminal, "Add CloudShell VNet to database configuration?");
}
/**
* Adds the CloudShell VNet to the database configuration
* Now supports both VNet rules and private endpoints
*/
public static async addCloudShellVNetToDatabase(vNetSettings: VnetSettings, terminal: Terminal): Promise<void> {
try {
terminal.writeln(terminalLog.header("Updating database network configuration"));
// Step 1: Get the subnet ID from CloudShell Network Profile
const { cloudShellSubnetId, cloudShellVnetId } = await this.getCloudShellNetworkIds(vNetSettings, terminal);
// Step 2: Get current database account details
const { currentDbAccount } = await this.getDatabaseAccountDetails(terminal);
// Step 3: Determine if database uses private endpoints
const usesPrivateEndpoints = hasPrivateEndpointsRestrictions() ||
(currentDbAccount.properties.privateEndpointConnections?.length > 0);
// Log which networking mode we're using
if (usesPrivateEndpoints) {
terminal.writeln(terminalLog.info("Database is configured with private endpoints"));
} else {
terminal.writeln(terminalLog.info("Database is configured with VNet rules"));
}
// Step 4: Check if connection is already configured
if (usesPrivateEndpoints) {
if (await this.isPrivateEndpointAlreadyConfigured(cloudShellVnetId, currentDbAccount, terminal)) {
return;
}
} else {
if (await this.isVNetAlreadyConfigured(cloudShellSubnetId, currentDbAccount, terminal)) {
return;
}
}
// Step 5: Check network resource statuses and ongoing operations
const { vnetInfo, subnetInfo, operationInProgress } =
await this.checkNetworkResourceStatuses(cloudShellSubnetId, cloudShellVnetId, currentDbAccount.id, terminal);
// Step 6: If no operation in progress, update the configuration
if (!operationInProgress) {
if (usesPrivateEndpoints) {
// Create or update private endpoint configuration
await this.configurePrivateEndpoint(
cloudShellSubnetId,
vnetInfo.location,
currentDbAccount.id,
terminal
);
} else {
// Enable CosmosDB service endpoint on subnet if needed (for VNet rules)
await this.enableCosmosDBServiceEndpoint(cloudShellSubnetId, subnetInfo, terminal);
// Update database account with VNet rule
await this.updateDatabaseWithVNetRule(currentDbAccount, cloudShellSubnetId, currentDbAccount.id, terminal);
}
} else {
terminal.writeln(terminalLog.info("Monitoring existing network operation..."));
// Step 7: Monitor the update progress
await this.monitorVNetAdditionProgress(cloudShellSubnetId, currentDbAccount.id, terminal);
}
} catch (err) {
terminal.writeln(terminalLog.error(`Error updating database network configuration: ${err.message}`));
throw err;
}
}
/**
* Checks if a private endpoint is already configured for the CloudShell VNet
*/
private static async isPrivateEndpointAlreadyConfigured(
cloudShellVnetId: string,
currentDbAccount: any,
terminal: Terminal
): Promise<boolean> {
// Check if private endpoints exist and are properly configured for this VNet
const hasConfiguredEndpoint = currentDbAccount.properties.privateEndpointConnections?.some(
(connection: any) => {
const isApproved = connection.properties.privateLinkServiceConnectionState.status === 'Approved';
// We would need to check if the endpoint is in the CloudShell VNet
// For simplicity, we're assuming connection.properties.networkInterface contains this info
const endpointVNetId = connection.properties.networkInterface?.id?.split('/subnets/')[0];
return isApproved && endpointVNetId === cloudShellVnetId;
}
);
if (hasConfiguredEndpoint) {
terminal.writeln(terminalLog.success("CloudShell private endpoint is already configured"));
return true;
}
return false;
}
/**
* Configures a private endpoint for the CloudShell VNet to connect to the database
*/
private static async configurePrivateEndpoint(
cloudShellSubnetId: string,
vnetLocation: any,
dbAccountId: string,
terminal: Terminal
): Promise<void> {
// Extract necessary information from IDs
const subnetIdParts = cloudShellSubnetId.split('/');
const subnetIndex = subnetIdParts.indexOf('subnets');
const subnetName = subnetIdParts[subnetIndex + 1];
const resourceGroup = subnetIdParts[4];
const subscriptionId = subnetIdParts[2];
// Generate a unique name for the private endpoint
const privateEndpointName = `pe-cloudshell-cosmos-${Math.floor(10000 + Math.random() * 90000)}`;
terminal.writeln(terminalLog.subheader("Creating private endpoint for CloudShell"));
terminal.writeln(terminalLog.item("Private Endpoint Name", privateEndpointName));
terminal.writeln(terminalLog.item("Target Subnet", subnetName));
// Construct the private endpoint creation payload
const privateEndpointPayload = {
location: vnetLocation,
properties: {
privateLinkServiceConnections: [
{
name: privateEndpointName,
properties: {
privateLinkServiceId: dbAccountId,
groupIds: [
"MongoDB"
],
requestMessage: "CloudShell connectivity request"
},
type: "Microsoft.Network/privateEndpoints/privateLinkServiceConnections"
}
],
subnet: {
id: cloudShellSubnetId
}
}
};
// Send the request to create the private endpoint (the connection still requires manual approval in the Azure portal)
terminal.writeln(terminalLog.info("Submitting private endpoint creation request"));
try {
const privateEndpointUrl = `/subscriptions/${subscriptionId}/resourceGroups/${resourceGroup}/providers/Microsoft.Network/privateEndpoints/${privateEndpointName}`;
await createPrivateEndpoint(privateEndpointUrl, privateEndpointPayload, "2024-05-01");
terminal.writeln(terminalLog.success("Private endpoint creation request submitted"));
terminal.writeln(terminalLog.warning("Please approve the private endpoint connection in the Azure portal"));
terminal.writeln(terminalLog.info("Note: Private endpoint operations may take several minutes to complete"));
} catch (err) {
terminal.writeln(terminalLog.error(`Failed to create private endpoint: ${err.message}`));
throw err;
}
}
/**
* Gets the subnet and VNet IDs from CloudShell Network Profile
*/
private static async getCloudShellNetworkIds(vNetSettings: VnetSettings, terminal: Terminal): Promise<{ cloudShellSubnetId: string; cloudShellVnetId: string }> {
const netProfileInfo = await getNetworkProfileInfo<any>(vNetSettings.networkProfileResourceId);
if (!netProfileInfo?.properties?.containerNetworkInterfaceConfigurations?.[0]
?.properties?.ipConfigurations?.[0]?.properties?.subnet?.id) {
throw new Error("Could not retrieve subnet ID from CloudShell VNet");
}
const cloudShellSubnetId = netProfileInfo.properties.containerNetworkInterfaceConfigurations[0]
.properties.ipConfigurations[0].properties.subnet.id;
// Extract VNet ID from subnet ID
const cloudShellVnetId = cloudShellSubnetId.substring(0, cloudShellSubnetId.indexOf('/subnets/'));
terminal.writeln(terminalLog.subheader("Identified CloudShell network resources"));
terminal.writeln(terminalLog.item("Subnet", cloudShellSubnetId.split('/').pop() || ""));
terminal.writeln(terminalLog.item("VNet", cloudShellVnetId.split('/').pop() || ""));
return { cloudShellSubnetId, cloudShellVnetId };
}
/**
* Gets the database account details
*/
private static async getDatabaseAccountDetails(terminal: Terminal): Promise<{ currentDbAccount: any }> {
const dbAccount = userContext.databaseAccount;
terminal.writeln(terminalLog.database("Verifying current configuration"));
const currentDbAccount = await getAccountDetails(dbAccount.id);
return { currentDbAccount };
}
/**
* Checks if the VNet is already configured in the database
*/
private static async isVNetAlreadyConfigured(cloudShellSubnetId: string, currentDbAccount: any, terminal: Terminal): Promise<boolean> {
const vnetAlreadyConfigured = currentDbAccount.properties.virtualNetworkRules &&
currentDbAccount.properties.virtualNetworkRules.some(
(rule: any) => rule.id === cloudShellSubnetId
);
if (vnetAlreadyConfigured) {
terminal.writeln(terminalLog.success("CloudShell VNet is already in database configuration"));
return true;
}
return false;
}
/**
* Checks the status of network resources and ongoing operations
*/
private static async checkNetworkResourceStatuses(
cloudShellSubnetId: string,
cloudShellVnetId: string,
dbAccountId: string,
terminal: Terminal
): Promise<{ vnetInfo: any; subnetInfo: any; operationInProgress: boolean }> {
terminal.writeln(terminalLog.subheader("Checking network resource status"));
let operationInProgress = false;
let vnetInfo: any = null;
let subnetInfo: any = null;
if (cloudShellVnetId && cloudShellSubnetId) {
// Get VNet and subnet resource status
vnetInfo = await getVnetInformation<any>(cloudShellVnetId);
subnetInfo = await getSubnetInformation<any>(cloudShellSubnetId);
// Check if there's an ongoing operation on the VNet or subnet
const vnetProvisioningState = vnetInfo?.properties?.provisioningState;
const subnetProvisioningState = subnetInfo?.properties?.provisioningState;
if (vnetProvisioningState !== 'Succeeded' && vnetProvisioningState !== 'Failed') {
terminal.writeln(terminalLog.warning(`VNet operation in progress: ${vnetProvisioningState}`));
operationInProgress = true;
}
if (subnetProvisioningState !== 'Succeeded' && subnetProvisioningState !== 'Failed') {
terminal.writeln(terminalLog.warning(`Subnet operation in progress: ${subnetProvisioningState}`));
operationInProgress = true;
}
// Also check database operations
const latestDbAccount = await getAccountDetails<any>(dbAccountId);
if (latestDbAccount.properties.virtualNetworkRules) {
const isPendingAdd = latestDbAccount.properties.virtualNetworkRules.some(
(rule: any) => rule.id === cloudShellSubnetId && rule.status === 'Updating'
);
if (isPendingAdd) {
terminal.writeln(terminalLog.warning("CloudShell VNet addition to database is already in progress"));
operationInProgress = true;
}
}
}
return { vnetInfo, subnetInfo, operationInProgress };
}
/**
* Enables the CosmosDB service endpoint on a subnet if needed
*/
private static async enableCosmosDBServiceEndpoint(cloudShellSubnetId: string, subnetInfo: any, terminal: Terminal): Promise<void> {
if (!subnetInfo) {
terminal.writeln(terminalLog.warning("Unable to check subnet endpoint configuration"));
return;
}
terminal.writeln(terminalLog.subheader("Checking and configuring CosmosDB service endpoint"));
// Parse the subnet ID to get resource information
const subnetIdParts = cloudShellSubnetId.split('/');
const subnetIndex = subnetIdParts.indexOf('subnets');
if (subnetIndex > 0) {
const subnetName = subnetIdParts[subnetIndex + 1];
const vnetName = subnetIdParts[subnetIndex - 1];
const resourceGroup = subnetIdParts[4];
const subscriptionId = subnetIdParts[2];
// Get the subnet URL
const subnetUrl = `/subscriptions/${subscriptionId}/resourceGroups/${resourceGroup}/providers/Microsoft.Network/virtualNetworks/${vnetName}/subnets/${subnetName}`;
// Check if CosmosDB service endpoint is already enabled
const hasCosmosDBEndpoint = subnetInfo.properties.serviceEndpoints &&
subnetInfo.properties.serviceEndpoints.some(
(endpoint: any) => endpoint.service === 'Microsoft.AzureCosmosDB'
);
if (!hasCosmosDBEndpoint) {
terminal.writeln(terminalLog.warning("Enabling CosmosDB service endpoint on subnet..."));
// Create update payload with CosmosDB service endpoint
const serviceEndpoints = [
...(subnetInfo.properties.serviceEndpoints || []),
{ service: 'Microsoft.AzureCosmosDB' }
];
// Update the subnet configuration while preserving existing properties
const subnetUpdatePayload = {
...subnetInfo,
properties: {
...subnetInfo.properties,
serviceEndpoints: serviceEndpoints
}
};
// Apply the subnet update
await updateSubnetInformation(subnetUrl, subnetUpdatePayload);
// Wait for the subnet update to complete
let subnetUpdateComplete = false;
let subnetRetryCount = 0;
while (!subnetUpdateComplete && subnetRetryCount < MAX_RETRY_COUNT) {
const updatedSubnet = await getSubnetInformation<any>(subnetUrl);
const endpointEnabled = updatedSubnet.properties.serviceEndpoints &&
updatedSubnet.properties.serviceEndpoints.some(
(endpoint: any) => endpoint.service === 'Microsoft.AzureCosmosDB'
);
if (endpointEnabled && updatedSubnet.properties.provisioningState === 'Succeeded') {
subnetUpdateComplete = true;
terminal.writeln(terminalLog.success("CosmosDB service endpoint enabled successfully"));
} else {
subnetRetryCount++;
terminal.writeln(terminalLog.progress("Subnet update", `Waiting (${subnetRetryCount}/${MAX_RETRY_COUNT})`));
await wait(POLLING_INTERVAL_MS);
}
}
if (!subnetUpdateComplete) {
throw new Error("Failed to enable CosmosDB service endpoint on subnet");
}
} else {
terminal.writeln(terminalLog.success("CosmosDB service endpoint is already enabled"));
}
}
}
/**
* Updates the database account with a new VNet rule
*/
private static async updateDatabaseWithVNetRule(currentDbAccount: any, cloudShellSubnetId: string, dbAccountId: string, terminal: Terminal): Promise<void> {
// Create a deep copy of the current database account
const updatePayload = JSON.parse(JSON.stringify(currentDbAccount));
// Update only the network-related properties
updatePayload.properties.virtualNetworkRules = [
...(currentDbAccount.properties.virtualNetworkRules || []),
{ id: cloudShellSubnetId, ignoreMissingVNetServiceEndpoint: false }
];
updatePayload.properties.isVirtualNetworkFilterEnabled = true;
// Update the database account
terminal.writeln(terminalLog.subheader("Submitting VNet update request to database"));
await updateDatabaseAccount(dbAccountId, updatePayload);
terminal.writeln(terminalLog.success("Updated Database account with Cloud Shell Vnet"));
}
/**
* Monitors the progress of adding a VNet to the database account
*/
private static async monitorVNetAdditionProgress(cloudShellSubnetId: string, dbAccountId: string, terminal: Terminal): Promise<void> {
let updateComplete = false;
let retryCount = 0;
let lastStatus = "";
let lastProgress = 0;
let lastOpId = "";
terminal.writeln(terminalLog.subheader("Monitoring database update progress"));
while (!updateComplete && retryCount < MAX_RETRY_COUNT) {
// Check if the VNet is now in the database account
const updatedDbAccount = await getAccountDetails<any>(dbAccountId);
const isVNetAdded = updatedDbAccount.properties.virtualNetworkRules?.some(
(rule: any) => rule.id === cloudShellSubnetId && (!rule.status || rule.status === 'Succeeded')
);
if (isVNetAdded) {
updateComplete = true;
terminal.writeln(terminalLog.success("CloudShell VNet successfully added to database configuration"));
break;
}
// If not yet added, check for operation progress
const operations = await getDatabaseOperations<any>(dbAccountId);
// Find network-related operations
const networkOps = operations.value?.filter(
(op: any) =>
(op.properties.description?.toLowerCase().includes('network') ||
op.properties.description?.toLowerCase().includes('vnet'))
) || [];
// Find active operations
const activeOp = networkOps.find((op: any) => op.properties.status === 'InProgress');
if (activeOp) {
// Show progress details if available
const currentStatus = activeOp.properties.status;
const progress = activeOp.properties.percentComplete || 0;
const opId = activeOp.name;
// Only update the terminal if something has changed
if (currentStatus !== lastStatus || progress !== lastProgress || opId !== lastOpId) {
// Create a progress bar
const progressBarLength = 20;
const filledLength = Math.floor(progress / 100 * progressBarLength);
const progressBar = "█".repeat(filledLength) + "░".repeat(progressBarLength - filledLength);
terminal.writeln(`\x1B[34m [${progressBar}] ${progress}% - ${currentStatus}\x1B[0m`);
lastStatus = currentStatus;
lastProgress = progress;
lastOpId = opId;
}
} else if (networkOps.length > 0) {
// If there are completed operations, show their status
const lastCompletedOp = networkOps[0];
if (lastCompletedOp.properties.status !== lastStatus) {
terminal.writeln(terminalLog.progress("Operation status", lastCompletedOp.properties.status));
lastStatus = lastCompletedOp.properties.status;
}
}
retryCount++;
await wait(POLLING_INTERVAL_MS);
}
if (!updateComplete) {
terminal.writeln(terminalLog.warning("Database update timed out. Please check the Azure portal."));
}
}
/**
* Configures a new VNet for CloudShell
*/
public static async configureCloudShellVNet(terminal: Terminal, resolvedRegion: string): Promise<VnetSettings> {
// Generate short, unique names for the CloudShell network resources
const randomSuffix = Math.floor(10000 + Math.random() * 90000);
const subnetName = `cloudshell-subnet-${randomSuffix}`;
const vnetName = `cloudshell-vnet-${randomSuffix}`;
const networkProfileName = `cloudshell-network-profile-${randomSuffix}`;
const relayName = `cloudshell-relay-${randomSuffix}`;
terminal.writeln(terminalLog.header("Network Resource Configuration"));
const azureContainerInstanceOID = await askQuestion(
terminal,
"Enter Azure Container Instance OID (Refer. https://learn.microsoft.com/en-us/azure/cloud-shell/vnet/deployment#get-the-azure-container-instance-id)",
DEFAULT_CONTAINER_INSTANCE_OID
);
const vNetSubscriptionId = await askQuestion(
terminal,
"Enter Virtual Network Subscription ID",
userContext.subscriptionId
);
const vNetResourceGroup = await askQuestion(
terminal,
"Enter Virtual Network Resource Group",
userContext.resourceGroup
);
// Step 1: Create VNet with Subnet
terminal.writeln(terminalLog.header("Deploying Network Resources"));
const vNetConfigPayload = await this.createCloudShellVnet(
resolvedRegion,
subnetName,
terminal,
vnetName,
vNetSubscriptionId,
vNetResourceGroup
);
// Step 2: Create Network Profile
await this.createNetworkProfileWithVnet(
vNetSubscriptionId,
vNetResourceGroup,
vnetName,
subnetName,
resolvedRegion,
terminal,
networkProfileName
);
// Step 3: Create Network Relay
await this.createNetworkRelay(
resolvedRegion,
terminal,
relayName,
vNetSubscriptionId,
vNetResourceGroup
);
// Step 4: Assign Roles
terminal.writeln(terminalLog.header("Configuring Security Permissions"));
await this.assignRoleToNetworkProfile(
azureContainerInstanceOID,
vNetSubscriptionId,
terminal,
networkProfileName,
vNetResourceGroup
);
await this.assignRoleToRelay(
azureContainerInstanceOID,
vNetSubscriptionId,
terminal,
relayName,
vNetResourceGroup
);
// Step 5: Create and return VNet settings
const networkProfileResourceId = `/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Network/networkProfiles/${networkProfileName.replace(/[\n\r]/g, "")}`;
const relayResourceId = `/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Relay/namespaces/${relayName.replace(/[\n\r]/g, "")}`;
terminal.writeln(terminalLog.success("Network configuration complete"));
return {
networkProfileResourceId,
relayNamespaceResourceId: relayResourceId,
location: vNetConfigPayload.location
};
}
/**
* Creates a VNet for CloudShell
*/
private static async createCloudShellVnet(
resolvedRegion: string,
subnetName: string,
terminal: Terminal,
vnetName: string,
vNetSubscriptionId: string,
vNetResourceGroup: string
): Promise<any> {
const vNetConfigPayload = {
location: resolvedRegion,
properties: {
addressSpace: {
addressPrefixes: [DEFAULT_VNET_ADDRESS_PREFIX],
},
subnets: [
{
name: subnetName,
properties: {
addressPrefix: DEFAULT_SUBNET_ADDRESS_PREFIX,
delegations: [
{
name: "CloudShellDelegation",
properties: {
serviceName: "Microsoft.ContainerInstance/containerGroups"
}
}
],
},
},
],
},
};
terminal.writeln(terminalLog.vnet(`Creating VNet: ${vnetName}`));
let vNetResponse = await updateVnet<any>(
`/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Network/virtualNetworks/${vnetName}`,
vNetConfigPayload
);
while (vNetResponse?.properties?.provisioningState !== "Succeeded") {
vNetResponse = await getVnet<any>(
`/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Network/virtualNetworks/${vnetName}`
);
const vNetState = vNetResponse?.properties?.provisioningState;
if (vNetState !== "Succeeded" && vNetState !== "Failed") {
await wait(POLLING_INTERVAL_MS);
terminal.writeln(terminalLog.progress("VNet deployment", vNetState));
} else {
break;
}
}
terminal.writeln(terminalLog.success("VNet created successfully"));
return vNetConfigPayload;
}
/**
* Creates a Network Profile for CloudShell
*/
private static async createNetworkProfileWithVnet(
vNetSubscriptionId: string,
vNetResourceGroup: string,
vnetName: string,
subnetName: string,
resolvedRegion: string,
terminal: Terminal,
networkProfileName: string
): Promise<void> {
const subnetId = `/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Network/virtualNetworks/${vnetName}/subnets/${subnetName}`;
const createNetworkProfilePayload = {
location: resolvedRegion,
properties: {
containerNetworkInterfaceConfigurations: [
{
name: 'defaultContainerNicConfig',
properties: {
ipConfigurations: [
{
name: 'defaultContainerIpConfig',
properties: {
subnet: {
id: subnetId,
}
}
}
]
}
}
]
}
};
terminal.writeln(terminalLog.vnet("Creating Network Profile"));
let networkProfileResponse = await createNetworkProfile<any>(
`/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Network/networkProfiles/${networkProfileName}`,
createNetworkProfilePayload
);
while (networkProfileResponse?.properties?.provisioningState !== "Succeeded") {
networkProfileResponse = await getNetworkProfileInfo<any>(
`/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Network/networkProfiles/${networkProfileName}`
);
const networkProfileState = networkProfileResponse?.properties?.provisioningState;
if (networkProfileState !== "Succeeded" && networkProfileState !== "Failed") {
await wait(POLLING_INTERVAL_MS);
terminal.writeln(terminalLog.progress("Network Profile", networkProfileState));
} else {
break;
}
}
terminal.writeln(terminalLog.success("Network Profile created successfully"));
}
/**
* Creates a Network Relay for CloudShell
*/
private static async createNetworkRelay(
resolvedRegion: string,
terminal: Terminal,
relayName: string,
vNetSubscriptionId: string,
vNetResourceGroup: string
): Promise<void> {
const relayPayload = {
location: resolvedRegion,
sku: {
name: STANDARD_SKU,
tier: STANDARD_SKU,
}
};
terminal.writeln(terminalLog.vnet("Creating Relay Namespace"));
let relayResponse = await createRelay<any>(
`/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Relay/namespaces/${relayName}`,
relayPayload
);
while (relayResponse?.properties?.provisioningState !== "Succeeded") {
relayResponse = await getRelay<any>(
`/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Relay/namespaces/${relayName}`
);
const relayState = relayResponse?.properties?.provisioningState;
if (relayState !== "Succeeded" && relayState !== "Failed") {
await wait(POLLING_INTERVAL_MS);
terminal.writeln(terminalLog.progress("Relay Namespace", relayState));
} else {
break;
}
}
terminal.writeln(terminalLog.success("Relay Namespace created successfully"));
}
/**
* Assigns a role to a Network Profile
*/
private static async assignRoleToNetworkProfile(
azureContainerInstanceOID: string,
vNetSubscriptionId: string,
terminal: Terminal,
networkProfileName: string,
vNetResourceGroup: string
): Promise<void> {
const nfRoleName = uuidv4();
const networkProfileRoleAssignmentPayload = {
properties: {
principalId: azureContainerInstanceOID,
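// 4d97b98b-1d4f-4787-a291-c67834d212e7 is the Azure built-in "Network Contributor" role definition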
roleDefinitionId: `/subscriptions/${vNetSubscriptionId}/providers/Microsoft.Authorization/roleDefinitions/4d97b98b-1d4f-4787-a291-c67834d212e7`
}
};
terminal.writeln(terminalLog.info("Assigning permissions to Network Profile"));
await createRoleOnNetworkProfile<any>(
`/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Network/networkProfiles/${networkProfileName}/providers/Microsoft.Authorization/roleAssignments/${nfRoleName}`,
networkProfileRoleAssignmentPayload
);
terminal.writeln(terminalLog.success("Network Profile permissions assigned"));
}
/**
* Assigns a role to a Network Relay
*/
private static async assignRoleToRelay(
azureContainerInstanceOID: string,
vNetSubscriptionId: string,
terminal: Terminal,
relayName: string,
vNetResourceGroup: string
): Promise<void> {
const relayRoleName = uuidv4();
const relayRoleAssignmentPayload = {
properties: {
principalId: azureContainerInstanceOID,
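// b24988ac-6180-42a0-ab88-20f7382dd24c is the Azure built-in "Contributor" role definition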
roleDefinitionId: `/subscriptions/${vNetSubscriptionId}/providers/Microsoft.Authorization/roleDefinitions/b24988ac-6180-42a0-ab88-20f7382dd24c`,
}
};
terminal.writeln(terminalLog.info("Assigning permissions to Relay Namespace"));
await createRoleOnRelay<any>(
`/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Relay/namespaces/${relayName}/providers/Microsoft.Authorization/roleAssignments/${relayRoleName}`,
relayRoleAssignmentPayload
);
terminal.writeln(terminalLog.success("Relay Namespace permissions assigned"));
}
}
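The methods above repeatedly extract the subscription, resource group, VNet, and subnet names from an ARM resource ID by splitting on '/'. A hypothetical helper (not part of this change) that captures the same convention:
// Illustration only. ARM subnet IDs have the shape:
// /subscriptions/{subId}/resourceGroups/{rg}/providers/Microsoft.Network/virtualNetworks/{vnet}/subnets/{subnet}
export function parseSubnetResourceId(subnetId: string) {
  const parts = subnetId.split("/");
  const subnetIndex = parts.indexOf("subnets");
  return {
    subscriptionId: parts[2], // element after "subscriptions"
    resourceGroup: parts[4], // element after "resourceGroups"
    vnetName: parts[subnetIndex - 1],
    subnetName: parts[subnetIndex + 1],
  };
}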

View File

@@ -0,0 +1,80 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Cassandra shell type handler
*/
import { Terminal } from "xterm";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { userContext } from "../../../../UserContext";
import { listKeys } from "../../../../Utils/arm/generatedClients/cosmos/databaseAccounts";
import { setShellType } from "../Data/CloudShellApiClient";
import { NetworkAccessHandler } from "../Network/NetworkAccessHandler";
import { getHostFromUrl } from "../Utils/CommonUtils";
import { ShellTypeConfig } from "./ShellTypeFactory";
export class CassandraShellHandler implements ShellTypeConfig {
private shellType: TerminalKind = TerminalKind.Cassandra;
constructor() {
setShellType(this.shellType);
}
public getShellName(): string {
return "Cassandra";
}
public async getInitialCommands(): Promise<string> {
const dbAccount = userContext.databaseAccount;
const endpoint = dbAccount.properties.cassandraEndpoint;
// Get database key
const dbName = dbAccount.name;
let key = "";
if (dbName) {
const keys = await listKeys(userContext.subscriptionId, userContext.resourceGroup, dbName);
key = keys?.primaryMasterKey || "";
}
const config = {
host: getHostFromUrl(endpoint),
name: dbAccount.name,
password: key,
endpoint: endpoint
};
return this.getCommands(config).join("\n").concat("\n");
}
public async configureNetworkAccess(terminal: Terminal, region: string): Promise<{
vNetSettings: any;
isAllPublicAccessEnabled: boolean;
}> {
return await NetworkAccessHandler.configureNetworkAccess(terminal, region, this.shellType);
}
private getCommands(config: any): string[] {
return [
// 1. Fetch and display location details in a readable format
"curl -s https://ipinfo.io | jq -r '\"Region: \" + .region + \" Country: \" + .country + \" City: \" + .city + \" IP Addr: \" + .ip'",
// 2. Check if cqlsh is installed; if not, proceed with installation
"if ! command -v cqlsh &> /dev/null; then echo '⚠️ cqlsh not found. Installing...'; fi",
// 3. Download Cassandra if not installed
"if ! command -v cqlsh &> /dev/null; then curl -LO https://archive.apache.org/dist/cassandra/5.0.3/apache-cassandra-5.0.3-bin.tar.gz; fi",
// 4. Extract Cassandra package if not installed
"if ! command -v cqlsh &> /dev/null; then tar -xvzf apache-cassandra-5.0.3-bin.tar.gz; fi",
// 5. Move Cassandra binaries if not installed
"if ! command -v cqlsh &> /dev/null; then mkdir -p ~/cassandra && mv apache-cassandra-5.0.3/* ~/cassandra/; fi",
// 6. Add Cassandra to PATH if not installed
"if ! command -v cqlsh &> /dev/null; then echo 'export PATH=$HOME/cassandra/bin:$PATH' >> ~/.bashrc; fi",
// 7. Set environment variables for SSL
"if ! command -v cqlsh &> /dev/null; then echo 'export SSL_VERSION=TLSv1_2' >> ~/.bashrc; fi",
"if ! command -v cqlsh &> /dev/null; then echo 'export SSL_VALIDATE=false' >> ~/.bashrc; fi",
// 8. Source .bashrc to update PATH (even if cqlsh was already installed)
"source ~/.bashrc",
// 9. Verify cqlsh installation
"cqlsh --version",
// 10. Login to Cassandra
`cqlsh ${config.host} 10350 -u ${config.name} -p ${config.password} --ssl --protocol-version=4`
];
}
}

View File

@@ -0,0 +1,77 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Mongo shell type handler
*/
import { Terminal } from "xterm";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { userContext } from "../../../../UserContext";
import { listKeys } from "../../../../Utils/arm/generatedClients/cosmos/databaseAccounts";
import { setShellType } from "../Data/CloudShellApiClient";
import { NetworkAccessHandler } from "../Network/NetworkAccessHandler";
import { getHostFromUrl } from "../Utils/CommonUtils";
import { ShellTypeConfig } from "./ShellTypeFactory";
export class MongoShellHandler implements ShellTypeConfig {
private shellType: TerminalKind = TerminalKind.Mongo;
constructor() {
setShellType(this.shellType);
}
public getShellName(): string {
return "MongoDB";
}
public async getInitialCommands(): Promise<string> {
const dbAccount = userContext.databaseAccount;
const endpoint = dbAccount.properties.mongoEndpoint;
// Get database key
const dbName = dbAccount.name;
let key = "";
if (dbName) {
const keys = await listKeys(userContext.subscriptionId, userContext.resourceGroup, dbName);
key = keys?.primaryMasterKey || "";
}
const config = {
host: getHostFromUrl(endpoint),
name: dbAccount.name,
password: key,
endpoint: endpoint
};
return this.getCommands(config).join("\n").concat("\n");
}
public async configureNetworkAccess(terminal: Terminal, region: string): Promise<{
vNetSettings: any;
isAllPublicAccessEnabled: boolean;
}> {
return await NetworkAccessHandler.configureNetworkAccess(terminal, region, this.shellType);
}
private getCommands(config: any): string[] {
return [
// 1. Fetch and display location details in a readable format
"curl -s https://ipinfo.io | jq -r '\"Region: \" + .region + \" Country: \" + .country + \" City: \" + .city + \" IP Addr: \" + .ip'",
// 2. Check if mongosh is installed; if not, proceed with installation
"if ! command -v mongosh &> /dev/null; then echo '⚠️ mongosh not found. Installing...'; fi",
// 3. Download mongosh if not installed
"if ! command -v mongosh &> /dev/null; then curl -LO https://downloads.mongodb.com/compass/mongosh-2.3.8-linux-x64.tgz; fi",
// 4. Extract mongosh package if not installed
"if ! command -v mongosh &> /dev/null; then tar -xvzf mongosh-2.3.8-linux-x64.tgz; fi",
// 5. Move mongosh binaries if not installed
"if ! command -v mongosh &> /dev/null; then mkdir -p ~/mongosh && mv mongosh-2.3.8-linux-x64/* ~/mongosh/; fi",
// 6. Add mongosh to PATH if not installed
"if ! command -v mongosh &> /dev/null; then echo 'export PATH=$HOME/mongosh/bin:$PATH' >> ~/.bashrc; fi",
// 7. Source .bashrc to update PATH (even if mongosh was already installed)
"source ~/.bashrc",
// 8. Verify mongosh installation
"mongosh --version",
// 9. Login to MongoDB
`mongosh --host ${config.host} --port 10255 --username ${config.name} --password ${config.password} --tls --tlsAllowInvalidCertificates`
];
}
}

View File

@@ -0,0 +1,82 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* PostgreSQL shell type handler
*/
import { Terminal } from "xterm";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { userContext } from "../../../../UserContext";
import { listKeys } from "../../../../Utils/arm/generatedClients/cosmos/databaseAccounts";
import { setShellType } from "../Data/CloudShellApiClient";
import { NetworkAccessHandler } from "../Network/NetworkAccessHandler";
import { getHostFromUrl } from "../Utils/CommonUtils";
import { ShellTypeConfig } from "./ShellTypeFactory";
export class PostgresShellHandler implements ShellTypeConfig {
private shellType: TerminalKind = TerminalKind.Postgres;
constructor() {
setShellType(this.shellType);
}
public getShellName(): string {
return "PostgreSQL";
}
public async getInitialCommands(): Promise<string> {
const dbAccount = userContext.databaseAccount;
const endpoint = dbAccount.properties.postgresqlEndpoint;
// Get database key
const dbName = dbAccount.name;
let key = "";
if (dbName) {
const keys = await listKeys(userContext.subscriptionId, userContext.resourceGroup, dbName);
key = keys?.primaryMasterKey || "";
}
const config = {
host: getHostFromUrl(endpoint),
name: dbAccount.name,
password: key,
endpoint: endpoint
};
return this.getCommands(config).join("\n").concat("\n");
}
public async configureNetworkAccess(terminal: Terminal, region: string): Promise<{
vNetSettings: any;
isAllPublicAccessEnabled: boolean;
}> {
return await NetworkAccessHandler.configureNetworkAccess(terminal, region, this.shellType);
}
private getCommands(config: any): string[] {
return [
// 1. Fetch and display location details in a readable format
"curl -s https://ipinfo.io | jq -r '\"Region: \" + .region + \" Country: \" + .country + \" City: \" + .city + \" IP Addr: \" + .ip'",
// 2. Check if psql is installed; if not, proceed with installation
"if ! command -v psql &> /dev/null; then echo '⚠️ psql not found. Installing...'; fi",
// 3. Download PostgreSQL if not installed
"if ! command -v psql &> /dev/null; then curl -LO https://ftp.postgresql.org/pub/source/v15.2/postgresql-15.2.tar.bz2; fi",
// 4. Extract PostgreSQL package if not installed
"if ! command -v psql &> /dev/null; then tar -xvjf postgresql-15.2.tar.bz2; fi",
// 5. Create a directory for PostgreSQL installation if not installed
"if ! command -v psql &> /dev/null; then mkdir -p ~/pgsql; fi",
// 6. Download readline (dependency for PostgreSQL) if not installed
"if ! command -v psql &> /dev/null; then curl -LO https://ftp.gnu.org/gnu/readline/readline-8.1.tar.gz; fi",
// 7. Extract readline package if not installed
"if ! command -v psql &> /dev/null; then tar -xvzf readline-8.1.tar.gz; fi",
// 8. Configure readline if not installed
"if ! command -v psql &> /dev/null; then cd readline-8.1 && ./configure --prefix=$HOME/pgsql; fi",
// 9. Add PostgreSQL to PATH if not installed
"if ! command -v psql &> /dev/null; then echo 'export PATH=$HOME/pgsql/bin:$PATH' >> ~/.bashrc; fi",
// 10. Source .bashrc to update PATH (even if psql was already installed)
"source ~/.bashrc",
// 11. Verify PostgreSQL installation
"psql --version",
// 12. Prompt for connection details, then login to PostgreSQL
`read -p "Enter Database Name: " dbname && read -p "Enter Username: " username && psql "host=${config.endpoint} port=5432 dbname=$dbname user=$username sslmode=require"`
];
}
}

View File

@@ -0,0 +1,57 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Factory for creating shell type handlers
*/
import { Terminal } from "xterm";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { CassandraShellHandler } from "./CassandraShellHandler";
import { MongoShellHandler } from "./MongoShellHandler";
import { PostgresShellHandler } from "./PostgresShellHandler";
import { VCoreMongoShellHandler } from "./VCoreMongoShellHandler";
export interface ShellTypeConfig {
getShellName(): string;
getInitialCommands(): Promise<string>;
configureNetworkAccess(terminal: Terminal, region: string): Promise<{
vNetSettings: any;
isAllPublicAccessEnabled: boolean;
}>;
}
export class ShellTypeHandler {
/**
* Gets the appropriate handler for the given shell type
*/
public static getHandler(shellType: TerminalKind): ShellTypeConfig {
switch (shellType) {
case TerminalKind.Postgres:
return new PostgresShellHandler();
case TerminalKind.Mongo:
return new MongoShellHandler();
case TerminalKind.VCoreMongo:
return new VCoreMongoShellHandler();
case TerminalKind.Cassandra:
return new CassandraShellHandler();
default:
throw new Error(`Unsupported shell type: ${shellType}`);
}
}
/**
* Gets the display name for a shell type
*/
public static getShellNameForDisplay(terminalKind: TerminalKind): string {
switch (terminalKind) {
case TerminalKind.Postgres:
return "PostgreSQL";
case TerminalKind.Mongo:
case TerminalKind.VCoreMongo:
return "MongoDB";
case TerminalKind.Cassandra:
return "Cassandra";
default:
return "";
}
}
}
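A hedged sketch (not part of this change) of how a caller might wire the factory together; prepareShell and its parameters are assumed names for illustration:
// Sketch only: picks a handler, configures network access, then returns the bootstrap commands.
import { Terminal } from "xterm";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { ShellTypeHandler } from "./ShellTypeFactory";
export async function prepareShell(terminal: Terminal, kind: TerminalKind, region: string): Promise<string> {
  const handler = ShellTypeHandler.getHandler(kind);
  terminal.writeln(`Starting ${handler.getShellName()} shell...`);
  await handler.configureNetworkAccess(terminal, region);
  // getInitialCommands() returns a newline-separated script that is replayed over the CloudShell socket.
  return handler.getInitialCommands();
}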

View File

@@ -0,0 +1,78 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* VCore MongoDB shell type handler
*/
import { Terminal } from "xterm";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { userContext } from "../../../../UserContext";
import { listKeys } from "../../../../Utils/arm/generatedClients/cosmos/databaseAccounts";
import { setShellType } from "../Data/CloudShellApiClient";
import { NetworkAccessHandler } from "../Network/NetworkAccessHandler";
import { getHostFromUrl } from "../Utils/CommonUtils";
import { ShellTypeConfig } from "./ShellTypeFactory";
export class VCoreMongoShellHandler implements ShellTypeConfig {
private shellType: TerminalKind = TerminalKind.VCoreMongo;
constructor() {
setShellType(this.shellType);
}
public getShellName(): string {
return "MongoDB VCore";
}
public async getInitialCommands(): Promise<string> {
const dbAccount = userContext.databaseAccount;
const endpoint = dbAccount.properties.vcoreMongoEndpoint;
// Get database key
const dbName = dbAccount.name;
let key = "";
if (dbName) {
const keys = await listKeys(userContext.subscriptionId, userContext.resourceGroup, dbName);
key = keys?.primaryMasterKey || "";
}
const config = {
host: getHostFromUrl(endpoint),
name: dbAccount.name,
password: key,
endpoint: endpoint
};
return this.getCommands(config).join("\n").concat("\n");
}
public async configureNetworkAccess(terminal: Terminal, region: string): Promise<{
vNetSettings: any;
isAllPublicAccessEnabled: boolean;
}> {
// VCore MongoDB uses private endpoints
return await NetworkAccessHandler.configureNetworkAccess(terminal, region, this.shellType);
}
private getCommands(config: any): string[] {
return [
// 1. Fetch and display location details in a readable format
"curl -s https://ipinfo.io | jq -r '\"Region: \" + .region + \" Country: \" + .country + \" City: \" + .city + \" IP Addr: \" + .ip'",
// 2. Check if mongosh is installed; if not, proceed with installation
"if ! command -v mongosh &> /dev/null; then echo '⚠️ mongosh not found. Installing...'; fi",
// 3. Download mongosh if not installed
"if ! command -v mongosh &> /dev/null; then curl -LO https://downloads.mongodb.com/compass/mongosh-2.3.8-linux-x64.tgz; fi",
// 4. Extract mongosh package if not installed
"if ! command -v mongosh &> /dev/null; then tar -xvzf mongosh-2.3.8-linux-x64.tgz; fi",
// 5. Move mongosh binaries if not installed
"if ! command -v mongosh &> /dev/null; then mkdir -p ~/mongosh && mv mongosh-2.3.8-linux-x64/* ~/mongosh/; fi",
// 6. Add mongosh to PATH if not installed
"if ! command -v mongosh &> /dev/null; then echo 'export PATH=$HOME/mongosh/bin:$PATH' >> ~/.bashrc; fi",
// 7. Source .bashrc to update PATH (even if mongosh was already installed)
"source ~/.bashrc",
// 8. Verify mongosh installation
"mongosh --version",
// 9. Login to MongoDB
`read -p "Enter username: " username && mongosh "mongodb+srv://$username:@${config.endpoint}/?authMechanism=SCRAM-SHA-256&retrywrites=false&maxIdleTimeMS=120000" --tls --tlsAllowInvalidCertificates`
];
}
}

View File

@@ -0,0 +1,123 @@
import { IDisposable, ITerminalAddon, Terminal } from 'xterm';
interface IAttachOptions {
bidirectional?: boolean;
}
export class AttachAddon implements ITerminalAddon {
private _socket: WebSocket;
private _bidirectional: boolean;
private _disposables: IDisposable[] = [];
private _socketData: string;
constructor(socket: WebSocket, options?: IAttachOptions) {
this._socket = socket;
// always set binary type to arraybuffer, we do not handle blobs
this._socket.binaryType = 'arraybuffer';
this._bidirectional = !(options && options.bidirectional === false);
this._socketData = '';
}
public activate(terminal: Terminal): void {
this._disposables.push(
addSocketListener(this._socket, 'message', ev => {
let data: ArrayBuffer | string = ev.data;
const startStatusJson = 'ie_us';
const endStatusJson = 'ie_ue';
if (typeof data === 'object') {
const enc = new TextDecoder("utf-8");
data = enc.decode(ev.data as any);
}
// for example of json object look in TerminalHelper in the socket.onMessage
if (data.includes(startStatusJson) && data.includes(endStatusJson)) {
// process as one line
const statusData = data.split(startStatusJson)[1].split(endStatusJson)[0];
data = data.replace(statusData, '');
data = data.replace(startStatusJson, '');
data = data.replace(endStatusJson, '');
} else if (data.includes(startStatusJson)) {
// check for start
const partialStatusData = data.split(startStatusJson)[1];
this._socketData += partialStatusData;
data = data.replace(partialStatusData, '');
data = data.replace(startStatusJson, '');
} else if (data.includes(endStatusJson)) {
// check for end and process the command
const partialStatusData = data.split(endStatusJson)[0];
this._socketData += partialStatusData;
data = data.replace(partialStatusData, '');
data = data.replace(endStatusJson, '');
this._socketData = '';
} else if (this._socketData.length > 0) {
// check if the line is all data then just concatenate
this._socketData += data;
data = '';
}
terminal.write(data);
})
);
if (this._bidirectional) {
this._disposables.push(terminal.onData(data => this._sendData(data)));
this._disposables.push(terminal.onBinary(data => this._sendBinary(data)));
}
this._disposables.push(addSocketListener(this._socket, 'close', () => this.dispose()));
this._disposables.push(addSocketListener(this._socket, 'error', () => this.dispose()));
}
public dispose(): void {
for (const d of this._disposables) {
d.dispose();
}
}
private _sendData(data: string): void {
if (!this._checkOpenSocket()) {
return;
}
this._socket.send(data);
}
private _sendBinary(data: string): void {
if (!this._checkOpenSocket()) {
return;
}
const buffer = new Uint8Array(data.length);
for (let i = 0; i < data.length; ++i) {
buffer[i] = data.charCodeAt(i) & 255;
}
this._socket.send(buffer);
}
private _checkOpenSocket(): boolean {
switch (this._socket.readyState) {
case WebSocket.OPEN:
return true;
case WebSocket.CONNECTING:
throw new Error('Attach addon was loaded before socket was open');
case WebSocket.CLOSING:
return false;
case WebSocket.CLOSED:
throw new Error('Attach addon socket is closed');
default:
throw new Error('Unexpected socket state');
}
}
}
function addSocketListener<K extends keyof WebSocketEventMap>(socket: WebSocket, type: K, handler: (this: WebSocket, ev: WebSocketEventMap[K]) => any): IDisposable {
socket.addEventListener(type, handler);
return {
dispose: () => {
if (!handler) {
// Already disposed
return;
}
socket.removeEventListener(type, handler);
}
};
}
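For clarity, a small illustration (not part of this change) of the 'ie_us'/'ie_ue' framing that the message handler strips before writing to the terminal; the JSON payload below is made up:
// Illustration only: mirrors the stripping done in activate() for a fully framed message.
function stripStatusFrame(data: string): { status: string | undefined; text: string } {
  const start = "ie_us";
  const end = "ie_ue";
  if (data.includes(start) && data.includes(end)) {
    const status = data.split(start)[1].split(end)[0];
    const text = data.replace(status, "").replace(start, "").replace(end, "");
    return { status, text };
  }
  return { status: undefined, text: data };
}
// stripStatusFrame('ie_us{"ready":true}ie_ue$ ') returns { status: '{"ready":true}', text: "$ " }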

View File

@@ -0,0 +1,84 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Common utility functions for CloudShell
*/
import { Terminal } from "xterm";
import { terminalLog } from "./LogFormatter";
/**
* Utility function to wait for a specified duration
*/
export const wait = (ms: number): Promise<void> => new Promise(resolve => setTimeout(resolve, ms));
/**
* Utility function to ask a question in the terminal
*/
export const askQuestion = (terminal: Terminal, question: string, defaultAnswer: string = ""): Promise<string> => {
return new Promise<string>((resolve) => {
const prompt = terminalLog.prompt(`${question} (${defaultAnswer}): `);
terminal.writeln(prompt);
terminal.focus();
let response = "";
const dataListener = terminal.onData((data: string) => {
if (data === "\r") { // Enter key pressed
terminal.writeln(""); // Move to a new line
dataListener.dispose();
if (response.trim() === "") {
response = defaultAnswer; // Use default answer if no input
}
return resolve(response.trim());
} else if (data === "\u007F" || data === "\b") { // Handle backspace
if (response.length > 0) {
response = response.slice(0, -1);
terminal.write("\x1B[D \x1B[D"); // Move cursor back, clear character
}
} else if (data.charCodeAt(0) >= 32) { // Ignore control characters
response += data;
terminal.write(data); // Display typed characters
}
});
// Prevent cursor movement beyond the prompt
terminal.onKey(({ domEvent }: { domEvent: KeyboardEvent }) => {
if (domEvent.key === "ArrowLeft" && response.length === 0) {
domEvent.preventDefault(); // Stop moving left beyond the question
}
});
});
};
/**
* Utility function to ask for yes/no confirmation
*/
export const askConfirmation = async (terminal: Terminal, question: string): Promise<boolean> => {
terminal.writeln("");
terminal.writeln(terminalLog.prompt(`${question} (Y/N)`));
terminal.focus();
return new Promise<boolean>((resolve) => {
const keyListener = terminal.onKey(({ key }: { key: string }) => {
keyListener.dispose();
terminal.writeln("");
if (key.toLowerCase() === 'y') {
resolve(true);
} else {
resolve(false);
}
});
});
};
/**
* Extract host from a URL
*/
export const getHostFromUrl = (url: string): string => {
try {
const urlObj = new URL(url);
return urlObj.hostname;
} catch (error) {
console.error("Invalid URL:", error);
return "";
}
};
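A short, assumed usage of these prompts (illustration only; promptForResourceGroup and its default value are hypothetical):
// Sketch only: chained prompts in the style of the VNet configuration flow.
import { Terminal } from "xterm";
import { askConfirmation, askQuestion } from "./CommonUtils";
export async function promptForResourceGroup(terminal: Terminal, defaultGroup: string): Promise<string | undefined> {
  const proceed = await askConfirmation(terminal, "Configure a new Virtual Network?");
  if (!proceed) {
    return undefined;
  }
  // askQuestion falls back to defaultGroup when the user just presses Enter.
  return askQuestion(terminal, "Enter Virtual Network Resource Group", defaultGroup);
}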

View File

@@ -0,0 +1,28 @@
/**
* Standardized terminal logging functions for consistent formatting
*/
export const terminalLog = {
// Section headers
header: (message: string) => `\n\x1B[1;34m┌─ ${message} ${"─".repeat(Math.max(45 - message.length, 0))}\x1B[0m`,
subheader: (message: string) => `\x1B[1;36m├ ${message}\x1B[0m`,
sectionEnd: () => `\x1B[1;34m└${"─".repeat(50)}\x1B[0m\n`,
// Status messages
success: (message: string) => `\x1B[32m✓ ${message}\x1B[0m`,
warning: (message: string) => `\x1B[33m⚠ ${message}\x1B[0m`,
error: (message: string) => `\x1B[31m✗ ${message}\x1B[0m`,
info: (message: string) => `\x1B[34m${message}\x1B[0m`,
// Resource information
database: (message: string) => `\x1B[35m🔶 Database: ${message}\x1B[0m`,
vnet: (message: string) => `\x1B[36m🔷 Network: ${message}\x1B[0m`,
cloudshell: (message: string) => `\x1B[32m🔷 CloudShell: ${message}\x1B[0m`,
// Data formatting
item: (label: string, value: string) => `${label}: \x1B[32m${value}\x1B[0m`,
progress: (operation: string, status: string) => `\x1B[34m${operation}: \x1B[36m${status}\x1B[0m`,
// User interaction
prompt: (message: string) => `\x1B[1;37m${message}\x1B[0m`,
separator: () => `\x1B[30;1m${"─".repeat(50)}\x1B[0m`
};
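A brief, hypothetical example of how the formatter is used (logVnetSummary is an assumed name, not part of this change):
// Sketch only: typical sequence of writes using the formatter above.
import { Terminal } from "xterm";
import { terminalLog } from "./LogFormatter";
export function logVnetSummary(terminal: Terminal, vnetId: string): void {
  terminal.writeln(terminalLog.header("Existing Network Configuration"));
  terminal.writeln(terminalLog.item("VNet", vnetId));
  terminal.writeln(terminalLog.success("Network configuration complete"));
  terminal.writeln(terminalLog.sectionEnd());
}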

View File

@@ -1150,27 +1150,16 @@ export const DocumentsTabComponent: React.FunctionComponent<IDocumentsTabCompone
           deletePromise = _bulkDeleteNoSqlDocuments(_collection, toDeleteDocumentIds);
         }
       } else {
-        if (isMongoBulkDeleteDisabled) {
-          // TODO: Once new mongo proxy is available for all users, remove the call for MongoProxyClient.deleteDocument().
-          // MongoProxyClient.deleteDocuments() should be called for all users.
-          deletePromise = MongoProxyClient.deleteDocument(
-            _collection.databaseId,
-            _collection as ViewModels.Collection,
-            toDeleteDocumentIds[0],
-          ).then(() => [toDeleteDocumentIds[0]]);
-          // ----------------------------------------------------------------------------------------------------
-        } else {
-          deletePromise = MongoProxyClient.deleteDocuments(
-            _collection.databaseId,
-            _collection as ViewModels.Collection,
-            toDeleteDocumentIds,
-          ).then(({ deletedCount, isAcknowledged }) => {
-            if (deletedCount === toDeleteDocumentIds.length && isAcknowledged) {
-              return toDeleteDocumentIds;
-            }
-            throw new Error(`Delete failed with deletedCount: ${deletedCount} and isAcknowledged: ${isAcknowledged}`);
-          });
-        }
+        deletePromise = MongoProxyClient.deleteDocuments(
+          _collection.databaseId,
+          _collection as ViewModels.Collection,
+          toDeleteDocumentIds,
+        ).then(({ deletedCount, isAcknowledged }) => {
+          if (deletedCount === toDeleteDocumentIds.length && isAcknowledged) {
+            return toDeleteDocumentIds;
+          }
+          throw new Error(`Delete failed with deletedCount: ${deletedCount} and isAcknowledged: ${isAcknowledged}`);
+        });
       }

       return deletePromise
@@ -2054,11 +2043,8 @@ export const DocumentsTabComponent: React.FunctionComponent<IDocumentsTabCompone
       }
     }, [prevSelectedColumnIds, refreshDocumentsGrid, selectedColumnIds]);

-    // TODO: remove isMongoBulkDeleteDisabled when new mongo proxy is enabled for all users
     // TODO: remove partitionKey.systemKey when JS SDK bug is fixed
-    const isMongoBulkDeleteDisabled = !MongoProxyClient.useMongoProxyEndpoint(Constants.MongoProxyApi.BulkDelete);
-    const isBulkDeleteDisabled =
-      (partitionKey.systemKey && !isPreferredApiMongoDB) || (isPreferredApiMongoDB && isMongoBulkDeleteDisabled);
+    const isBulkDeleteDisabled = partitionKey.systemKey && !isPreferredApiMongoDB;
     // -------------------------------------------------------

     const getFilterChoices = (): InputDatalistDropdownOptionSection[] => {

View File

@@ -1,6 +1,4 @@
import { useMongoProxyEndpoint } from "Common/MongoProxyClient";
import React, { Component } from "react"; import React, { Component } from "react";
import * as Constants from "../../../Common/Constants";
import { configContext } from "../../../ConfigContext"; import { configContext } from "../../../ConfigContext";
import * as ViewModels from "../../../Contracts/ViewModels"; import * as ViewModels from "../../../Contracts/ViewModels";
import { Action, ActionModifiers } from "../../../Shared/Telemetry/TelemetryConstants"; import { Action, ActionModifiers } from "../../../Shared/Telemetry/TelemetryConstants";
@@ -50,15 +48,13 @@ export default class MongoShellTabComponent extends Component<
IMongoShellTabComponentStates IMongoShellTabComponentStates
> { > {
private _logTraces: Map<string, number>; private _logTraces: Map<string, number>;
private _useMongoProxyEndpoint: boolean;
constructor(props: IMongoShellTabComponentProps) { constructor(props: IMongoShellTabComponentProps) {
super(props); super(props);
this._logTraces = new Map(); this._logTraces = new Map();
this._useMongoProxyEndpoint = useMongoProxyEndpoint(Constants.MongoProxyApi.LegacyMongoShell);
this.state = { this.state = {
url: getMongoShellUrl(this._useMongoProxyEndpoint), url: getMongoShellUrl(),
}; };
props.onMongoShellTabAccessor({ props.onMongoShellTabAccessor({
@@ -113,17 +109,9 @@ export default class MongoShellTabComponent extends Component<
const resourceId = databaseAccount?.id; const resourceId = databaseAccount?.id;
const accountName = databaseAccount?.name; const accountName = databaseAccount?.name;
const documentEndpoint = databaseAccount?.properties.mongoEndpoint || databaseAccount?.properties.documentEndpoint; const documentEndpoint = databaseAccount?.properties.mongoEndpoint || databaseAccount?.properties.documentEndpoint;
const mongoEndpoint =
documentEndpoint.substr(
Constants.MongoDBAccounts.protocol.length + 3,
documentEndpoint.length -
(Constants.MongoDBAccounts.protocol.length + 2 + Constants.MongoDBAccounts.defaultPort.length),
) + Constants.MongoDBAccounts.defaultPort.toString();
    const databaseId = this.props.collection.databaseId;
    const collectionId = this.props.collection.id();
    const apiEndpoint = this._useMongoProxyEndpoint
      ? configContext.MONGO_PROXY_ENDPOINT
      : configContext.BACKEND_ENDPOINT;
    const apiEndpoint = configContext.MONGO_PROXY_ENDPOINT;
    const encryptedAuthToken: string = userContext.accessToken;
    shellIframe.contentWindow.postMessage(
@@ -132,7 +120,7 @@ export default class MongoShellTabComponent extends Component<
        data: {
          resourceId: resourceId,
          accountName: accountName,
          mongoEndpoint: this._useMongoProxyEndpoint ? documentEndpoint : mongoEndpoint,
          mongoEndpoint: documentEndpoint,
          authorization: authorization,
          databaseId: databaseId,
          collectionId: collectionId,

View File

@@ -2,8 +2,6 @@ import { Platform, resetConfigContext, updateConfigContext } from "../../../Conf
import { updateUserContext, userContext } from "../../../UserContext";
import { getMongoShellUrl } from "./getMongoShellUrl";
const mongoBackendEndpoint = "https://localhost:1234";
describe("getMongoShellUrl", () => {
  let queryString = "";
@@ -11,7 +9,6 @@ describe("getMongoShellUrl", () => {
    resetConfigContext();
    updateConfigContext({
      BACKEND_ENDPOINT: mongoBackendEndpoint,
      platform: Platform.Hosted,
    });
@@ -37,12 +34,7 @@ describe("getMongoShellUrl", () => {
    queryString = `resourceId=${userContext.databaseAccount.id}&accountName=${userContext.databaseAccount.name}&mongoEndpoint=${userContext.databaseAccount.properties.documentEndpoint}`;
  });
  it("should return /indexv2.html by default", () => {
    expect(getMongoShellUrl().toString()).toContain(`/indexv2.html?${queryString}`);
  it("should return /index.html by default", () => {
    expect(getMongoShellUrl().toString()).toContain(`/index.html?${queryString}`);
});
it("should return /index.html when useMongoProxyEndpoint is true", () => {
const useMongoProxyEndpoint: boolean = true;
expect(getMongoShellUrl(useMongoProxyEndpoint).toString()).toContain(`/index.html?${queryString}`);
}); });
});

View File

@@ -1,11 +1,11 @@
import { userContext } from "../../../UserContext";
export function getMongoShellUrl(useMongoProxyEndpoint?: boolean): string {
export function getMongoShellUrl(): string {
  const { databaseAccount: account } = userContext;
  const resourceId = account?.id;
  const accountName = account?.name;
  const mongoEndpoint = account?.properties?.mongoEndpoint || account?.properties?.documentEndpoint;
  const queryString = `resourceId=${resourceId}&accountName=${accountName}&mongoEndpoint=${mongoEndpoint}`;
  return useMongoProxyEndpoint ? `/mongoshell/index.html?${queryString}` : `/mongoshell/indexv2.html?${queryString}`;
  return `/mongoshell/index.html?${queryString}`;
}
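For orientation, a small sketch of what the simplified helper now yields (placeholder values; the legacy /indexv2.html variant and the useMongoProxyEndpoint switch are gone):

// Hypothetical account values, for illustration only. getMongoShellUrl() now
// always resolves to a URL of the form:
//   /mongoshell/index.html?resourceId=<ARM resource id>&accountName=<account name>&mongoEndpoint=<mongo or document endpoint>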

View File

@@ -1,6 +1,6 @@
/* eslint-disable @typescript-eslint/no-explicit-any */
/* eslint-disable no-console */
import { FeedOptions } from "@azure/cosmos";
import { FeedOptions, QueryOperationOptions } from "@azure/cosmos";
import QueryError, { createMonacoErrorLocationResolver, createMonacoMarkersForQueryErrors } from "Common/QueryError";
import { SplitterDirection } from "Common/Splitter";
import { Platform, configContext } from "ConfigContext";
@@ -18,7 +18,7 @@ import { CosmosFluentProvider } from "Explorer/Theme/ThemeUtil";
import { useSelectedNode } from "Explorer/useSelectedNode";
import { KeyboardAction } from "KeyboardShortcuts";
import { QueryConstants } from "Shared/Constants";
import { LocalStorageUtility, StorageKey } from "Shared/StorageUtility";
import { LocalStorageUtility, StorageKey, getRUThreshold, ruThresholdEnabled } from "Shared/StorageUtility";
import { Action } from "Shared/Telemetry/TelemetryConstants";
import { Allotment } from "allotment";
import { QueryCopilotState, useQueryCopilot } from "hooks/useQueryCopilot";
@@ -368,19 +368,20 @@ class QueryTabComponentImpl extends React.Component<QueryTabComponentImplProps,
        isExecutionError: false,
      });
      // let queryOperationOptions: QueryOperationOptions;
      // if (userContext.apiType === "SQL" && ruThresholdEnabled()) {
      //   const ruThreshold: number = getRUThreshold();
      //   queryOperationOptions = {
      //     ruCapPerOperation: ruThreshold,
      //   } as QueryOperationOptions;
      // }
      let queryOperationOptions: QueryOperationOptions;
      if (userContext.apiType === "SQL" && ruThresholdEnabled()) {
        const ruThreshold: number = getRUThreshold();
        queryOperationOptions = {
          ruCapPerOperation: ruThreshold,
        } as QueryOperationOptions;
      }
      const queryDocuments = async (firstItemIndex: number) =>
        await queryDocumentsPage(
          this.props.collection && this.props.collection.id(),
          this._iterator,
          firstItemIndex,
          // queryOperationOptions,
          queryOperationOptions,
        );
      this.props.tabsBaseInstance.isExecuting(true);
      this.setState({
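As a reading aid for the re-enabled block above (this is a sketch of its effect, not new code in the change): the RU cap is only attached for SQL API accounts with the RU threshold feature enabled; for every other API the options object stays undefined and queryDocumentsPage runs without a cap.

// Sketch of the effective gating, using the same helpers imported above.
const options: QueryOperationOptions | undefined =
  userContext.apiType === "SQL" && ruThresholdEnabled()
    ? ({ ruCapPerOperation: getRUThreshold() } as QueryOperationOptions)
    : undefined;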

View File

@@ -1,5 +1,6 @@
import { configContext } from "ConfigContext";
import * as DataModels from "Contracts/DataModels";
import * as ViewModels from "Contracts/ViewModels";
import { userContext } from "UserContext";
import { armRequest } from "Utils/arm/request";
@@ -10,16 +11,8 @@ export async function checkFirewallRules(
  setMessageFunc?: (message: string) => void,
  message?: string,
): Promise<void> {
  const firewallRulesUri = `${userContext.databaseAccount.id}/firewallRules`;
  // eslint-disable-next-line @typescript-eslint/no-explicit-any
  const response: any = await armRequest({
    host: configContext.ARM_ENDPOINT,
    path: firewallRulesUri,
    method: "GET",
    apiVersion: apiVersion,
  });
  const firewallRules: DataModels.FirewallRule[] = response?.data?.value || response?.value || [];
  const isEnabled = firewallRules.some(firewallRulesPredicate);
  const isEnabled = await callFirewallAPis(apiVersion, firewallRulesPredicate);
  if (isAllPublicIPAddressesEnabled) {
    isAllPublicIPAddressesEnabled(isEnabled);
@@ -42,3 +35,89 @@ export async function checkFirewallRules(
    );
  }
}
export async function callFirewallAPis(
  apiVersion: string,
  firewallRulesPredicate: (rule: DataModels.FirewallRule) => unknown,
): Promise<boolean> {
const firewallRulesUri = `${userContext.databaseAccount.id}/firewallRules`;
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const response: any = await armRequest({
host: configContext.ARM_ENDPOINT,
path: firewallRulesUri,
method: "GET",
apiVersion: apiVersion,
});
const firewallRules: DataModels.FirewallRule[] = response?.data?.value || response?.value || [];
const isEnabled = firewallRules.some(firewallRulesPredicate);
return isEnabled;
}
export async function checkNetworkRules(
  kind: ViewModels.TerminalKind,
  isPublicAccessEnabledFlag: ko.Observable<boolean> | React.Dispatch<React.SetStateAction<boolean>>,
): Promise<void> {
if (kind === ViewModels.TerminalKind.Postgres) {
await checkFirewallRules(
"2022-11-08",
(rule) => rule.properties.startIpAddress === "0.0.0.0" && rule.properties.endIpAddress === "255.255.255.255",
isPublicAccessEnabledFlag,
);
}
if (kind === ViewModels.TerminalKind.VCoreMongo) {
await checkFirewallRules(
"2023-03-01-preview",
(rule) =>
rule.name.startsWith("AllowAllAzureServicesAndResourcesWithinAzureIps") ||
(rule.properties.startIpAddress === "0.0.0.0" && rule.properties.endIpAddress === "255.255.255.255"),
isPublicAccessEnabledFlag,
);
}
}
export async function IsPublicAccessAvailable(kind: ViewModels.TerminalKind): Promise<boolean> {
if (kind === ViewModels.TerminalKind.Postgres) {
return await callFirewallAPis(
"2022-11-08",
(rule) => rule.properties.startIpAddress === "0.0.0.0" && rule.properties.endIpAddress === "255.255.255.255"
);
}
if (kind === ViewModels.TerminalKind.VCoreMongo) {
return await callFirewallAPis(
"2023-03-01-preview",
(rule) =>
rule.name.startsWith("AllowAllAzureServicesAndResourcesWithinAzureIps") ||
(rule.properties.startIpAddress === "0.0.0.0" && rule.properties.endIpAddress === "255.255.255.255")
);
}
return !hasDatabaseNetworkRestrictions();
}
/**
* Checks if the database account has network restrictions
*/
const hasDatabaseNetworkRestrictions = (): boolean => {
return hasVNetRestrictions() || hasFirewallRestrictions() || hasPrivateEndpointsRestrictions();
};
/**
* Checks if the database account has Private Endpoint restrictions
*/
export const hasPrivateEndpointsRestrictions = (): boolean => {
return userContext.databaseAccount.properties.privateEndpointConnections && userContext.databaseAccount.properties.privateEndpointConnections.length > 0;
};
/**
* Checks if the database account has Firewall restrictions
*/
export const hasFirewallRestrictions = (): boolean => {
  return userContext.databaseAccount.properties.isVirtualNetworkFilterEnabled;
};
/**
* Checks if the database account has VNet restrictions
*/
export const hasVNetRestrictions = (): boolean => {
  return userContext.databaseAccount.properties.virtualNetworkRules && userContext.databaseAccount.properties.virtualNetworkRules.length > 0;
};
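A minimal usage sketch of the new helpers (the call site below is hypothetical, not part of this diff): before opening a CloudShell terminal, a caller can ask whether the account is reachable over public access.

import * as ViewModels from "Contracts/ViewModels";
import { IsPublicAccessAvailable } from "Explorer/Tabs/Shared/CheckFirewallRules";

// Hypothetical helper: true when an allow-all firewall rule exists for the
// given shell kind, or when the account has no VNet, firewall, or
// private-endpoint restrictions at all.
export async function canOpenShell(kind: ViewModels.TerminalKind): Promise<boolean> {
  return await IsPublicAccessAvailable(kind);
}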

View File

@@ -1,7 +1,3 @@
import { IMessageBarStyles, MessageBar, MessageBarType } from "@fluentui/react";
import { CassandraProxyEndpoints, MongoProxyEndpoints } from "Common/Constants";
import { configContext } from "ConfigContext";
import { IpRule } from "Contracts/DataModels";
import { CollectionTabKind } from "Contracts/ViewModels";
import Explorer from "Explorer/Explorer";
import { useCommandBar } from "Explorer/Menus/CommandBar/CommandBarComponentAdapter";
@@ -12,10 +8,8 @@ import { PostgresConnectTab } from "Explorer/Tabs/PostgresConnectTab";
import { QuickstartTab } from "Explorer/Tabs/QuickstartTab";
import { VcoreMongoConnectTab } from "Explorer/Tabs/VCoreMongoConnectTab";
import { VcoreMongoQuickstartTab } from "Explorer/Tabs/VCoreMongoQuickstartTab";
import { LayoutConstants } from "Explorer/Theme/ThemeUtil";
import { KeyboardAction, KeyboardActionGroup, useKeyboardActionGroup } from "KeyboardShortcuts";
import { userContext } from "UserContext";
import { CassandraProxyOutboundIPs, MongoProxyOutboundIPs, PortalBackendIPs } from "Utils/EndpointUtils";
import { useTeachingBubble } from "hooks/useTeachingBubble";
import ko from "knockout";
import React, { MutableRefObject, useEffect, useRef, useState } from "react";
@@ -34,10 +28,6 @@ interface TabsProps {
export const Tabs = ({ explorer }: TabsProps): JSX.Element => {
  const { openedTabs, openedReactTabs, activeTab, activeReactTab } = useTabs();
const [
showMongoAndCassandraProxiesNetworkSettingsWarningState,
setShowMongoAndCassandraProxiesNetworkSettingsWarningState,
] = useState<boolean>(showMongoAndCassandraProxiesNetworkSettingsWarning());
  const setKeyboardHandlers = useKeyboardActionGroup(KeyboardActionGroup.TABS);
  useEffect(() => {
@@ -48,28 +38,8 @@ export const Tabs = ({ explorer }: TabsProps): JSX.Element => {
    });
  }, [setKeyboardHandlers]);
const defaultMessageBarStyles: IMessageBarStyles = {
root: {
height: `${LayoutConstants.rowHeight}px`,
overflow: "hidden",
flexDirection: "row",
},
};
  return (
    <div className="tabsManagerContainer">
{showMongoAndCassandraProxiesNetworkSettingsWarningState && (
<MessageBar
messageBarType={MessageBarType.warning}
styles={defaultMessageBarStyles}
onDismiss={() => {
setShowMongoAndCassandraProxiesNetworkSettingsWarningState(false);
}}
>
{`We have migrated our middleware to new infrastructure. To avoid issues with Data Explorer access, please
re-enable "Allow access from Azure Portal" on the Networking blade for your account.`}
</MessageBar>
)}
      <div className="nav-tabs-margin">
        <ul className="nav nav-tabs level navTabHeight" id="navTabs" role="tablist">
          {openedReactTabs.map((tab) => (
@@ -203,7 +173,7 @@ const CloseButton = ({
    onKeyPress={({ nativeEvent: e }) => (tab ? tab.onKeyPressClose(undefined, e) : onKeyPressReactTabClose(e, tabKind))}
  >
    <span className="tabIcon close-Icon">
      <img src={errorIcon} title="Close" alt="Close" role="none" />
      <img src={errorIcon} title="Close" alt="Close" aria-label="hidden" />
    </span>
  </span>
);
@@ -314,57 +284,3 @@ const getReactTabContent = (activeReactTab: ReactTabKind, explorer: Explorer): J
      throw Error(`Unsupported tab kind ${ReactTabKind[activeReactTab]}`);
  }
};
const showMongoAndCassandraProxiesNetworkSettingsWarning = (): boolean => {
const ipRules: IpRule[] = userContext.databaseAccount?.properties?.ipRules;
if (
((userContext.apiType === "Mongo" && configContext.MONGO_PROXY_ENDPOINT !== MongoProxyEndpoints.Development) ||
(userContext.apiType === "Cassandra" &&
configContext.CASSANDRA_PROXY_ENDPOINT !== CassandraProxyEndpoints.Development)) &&
ipRules?.length
) {
const legacyPortalBackendIPs: string[] = PortalBackendIPs[configContext.BACKEND_ENDPOINT];
const ipAddressesFromIPRules: string[] = ipRules.map((ipRule) => ipRule.ipAddressOrRange);
const ipRulesIncludeLegacyPortalBackend: boolean = legacyPortalBackendIPs.every((legacyPortalBackendIP: string) =>
ipAddressesFromIPRules.includes(legacyPortalBackendIP),
);
if (!ipRulesIncludeLegacyPortalBackend) {
return false;
}
if (userContext.apiType === "Mongo") {
const isProdOrMpacMongoProxyEndpoint: boolean = [MongoProxyEndpoints.Mpac, MongoProxyEndpoints.Prod].includes(
configContext.MONGO_PROXY_ENDPOINT,
);
const mongoProxyOutboundIPs: string[] = isProdOrMpacMongoProxyEndpoint
? [...MongoProxyOutboundIPs[MongoProxyEndpoints.Mpac], ...MongoProxyOutboundIPs[MongoProxyEndpoints.Prod]]
: MongoProxyOutboundIPs[configContext.MONGO_PROXY_ENDPOINT];
const ipRulesIncludeMongoProxy: boolean = mongoProxyOutboundIPs.every((mongoProxyOutboundIP: string) =>
ipAddressesFromIPRules.includes(mongoProxyOutboundIP),
);
return !ipRulesIncludeMongoProxy;
} else if (userContext.apiType === "Cassandra") {
const isProdOrMpacCassandraProxyEndpoint: boolean = [
CassandraProxyEndpoints.Mpac,
CassandraProxyEndpoints.Prod,
].includes(configContext.CASSANDRA_PROXY_ENDPOINT);
const cassandraProxyOutboundIPs: string[] = isProdOrMpacCassandraProxyEndpoint
? [
...CassandraProxyOutboundIPs[CassandraProxyEndpoints.Mpac],
...CassandraProxyOutboundIPs[CassandraProxyEndpoints.Prod],
]
: CassandraProxyOutboundIPs[configContext.CASSANDRA_PROXY_ENDPOINT];
const ipRulesIncludeCassandraProxy: boolean = cassandraProxyOutboundIPs.every(
(cassandraProxyOutboundIP: string) => ipAddressesFromIPRules.includes(cassandraProxyOutboundIP),
);
return !ipRulesIncludeCassandraProxy;
}
}
return false;
};

View File

@@ -1,7 +1,7 @@
import { Spinner, SpinnerSize } from "@fluentui/react";
import { MessageTypes } from "Contracts/ExplorerContracts";
import { QuickstartFirewallNotification } from "Explorer/Quickstart/QuickstartFirewallNotification";
import { checkFirewallRules } from "Explorer/Tabs/Shared/CheckFirewallRules";
import { checkNetworkRules } from "Explorer/Tabs/Shared/CheckFirewallRules";
import * as ko from "knockout";
import * as React from "react";
import FirewallRuleScreenshot from "../../../images/firewallRule.png";
@@ -13,8 +13,10 @@ import { CommandButtonComponentProps } from "../Controls/CommandButton/CommandBu
import { NotebookTerminalComponent } from "../Controls/Notebook/NotebookTerminalComponent";
import Explorer from "../Explorer";
import { useNotebook } from "../Notebook/useNotebook";
import { CloudShellTerminalComponent } from "./CloudShellTab/CloudShellTabComponent";
import TabsBase from "./TabsBase";
export interface TerminalTabOptions extends ViewModels.TabOptions {
  account: DataModels.DatabaseAccount;
  container: Explorer;
@@ -43,81 +45,98 @@ class NotebookTerminalComponentAdapter implements ReactAdapter {
          <QuickstartFirewallNotification
            messageType={MessageTypes.OpenPostgresNetworkingBlade}
            screenshot={FirewallRuleScreenshot}
            shellName={this.getShellNameForDisplay(this.kind)}
            shellName={getShellNameForDisplay(this.kind)}
          />
        );
      }
    return this.parameters() ? (
      <NotebookTerminalComponent
        notebookServerInfo={this.getNotebookServerInfo()}
        databaseAccount={this.getDatabaseAccount()}
        tabId={this.getTabId()}
        username={this.getUsername()}
      />
      />): (
<Spinner styles={{ root: { marginTop: 10 } }} size={SpinnerSize.large}></Spinner>
);
}
}
/**
* CloudShell terminal tab
*/
class CloudShellTerminalComponentAdapter implements ReactAdapter {
// parameters: true: show, false: hide
public parameters: ko.Computed<boolean>;
constructor(
private kind: ViewModels.TerminalKind,
) {}
public renderComponent(): JSX.Element {
console.log("this.parameters() " + this.parameters() );
return this.parameters() ? (
<CloudShellTerminalComponent
shellType={this.kind}/>
    ) : (
      <Spinner styles={{ root: { marginTop: 10 } }} size={SpinnerSize.large}></Spinner>
    );
  }
  private getShellNameForDisplay(terminalKind: ViewModels.TerminalKind): string {
export const getShellNameForDisplay = (terminalKind: ViewModels.TerminalKind): string => {
  switch (terminalKind) {
    case ViewModels.TerminalKind.Postgres:
      return "PostgreSQL";
    case ViewModels.TerminalKind.Mongo:
    case ViewModels.TerminalKind.VCoreMongo:
      return "MongoDB";
    default:
      return "";
  }
}
}
export default class TerminalTab extends TabsBase {
  public readonly html = '<div style="height: 100%" data-bind="react:notebookTerminalComponentAdapter"></div> ';
  public readonly html = '<div style="height: 100%" data-bind="react: terminalComponentAdapter"></div>';
  private container: Explorer;
  private notebookTerminalComponentAdapter: NotebookTerminalComponentAdapter;
  private terminalComponentAdapter: any;
  private isAllPublicIPAddressesEnabled: ko.Observable<boolean>;
  constructor(options: TerminalTabOptions) {
    super(options);
    this.container = options.container;
    this.isAllPublicIPAddressesEnabled = ko.observable(true);
    this.notebookTerminalComponentAdapter = new NotebookTerminalComponentAdapter(
      () => this.getNotebookServerInfo(options),
      () => userContext?.databaseAccount,
      () => this.tabId,
      () => this.getUsername(),
      this.isAllPublicIPAddressesEnabled,
      options.kind,
    );
    this.notebookTerminalComponentAdapter.parameters = ko.computed<boolean>(() => {
      if (
        this.isTemplateReady() &&
        useNotebook.getState().isNotebookEnabled &&
        useNotebook.getState().notebookServerInfo?.notebookServerEndpoint &&
        this.isAllPublicIPAddressesEnabled()
      ) {
        return true;
      }
      return false;
    });
    if (options.kind === ViewModels.TerminalKind.Postgres) {
      checkFirewallRules(
        "2022-11-08",
        (rule) => rule.properties.startIpAddress === "0.0.0.0" && rule.properties.endIpAddress === "255.255.255.255",
        this.isAllPublicIPAddressesEnabled,
      );
    }
    if (options.kind === ViewModels.TerminalKind.VCoreMongo) {
      checkFirewallRules(
        "2023-03-01-preview",
        (rule) =>
          rule.name.startsWith("AllowAllAzureServicesAndResourcesWithinAzureIps") ||
          (rule.properties.startIpAddress === "0.0.0.0" && rule.properties.endIpAddress === "255.255.255.255"),
        this.isAllPublicIPAddressesEnabled,
      );
    }
  }
  constructor (options: TerminalTabOptions) {
    super(options);
    this.container = options.container;
    this.isAllPublicIPAddressesEnabled = ko.observable(true);
    checkNetworkRules(options.kind, this.isAllPublicIPAddressesEnabled);
    this.initializeNotebookTerminalAdapter(options);
  }
  private async initializeNotebookTerminalAdapter(options: TerminalTabOptions): Promise<void> {
    if (userContext.features.enableCloudShell) {
      this.terminalComponentAdapter = new CloudShellTerminalComponentAdapter(
        options.kind
      );
      this.terminalComponentAdapter.parameters = ko.computed<boolean>(() =>
        this.isTemplateReady()
      );
    }
    else {
      this.terminalComponentAdapter = new NotebookTerminalComponentAdapter(
        () => this.getNotebookServerInfo(options),
        () => userContext?.databaseAccount,
        () => this.tabId,
        () => this.getUsername(),
        this.isAllPublicIPAddressesEnabled,
        options.kind
      );
      this.terminalComponentAdapter.parameters = ko.computed<boolean>(() =>
        this.isTemplateReady() &&
        useNotebook.getState().isNotebookEnabled &&
        useNotebook.getState().notebookServerInfo?.notebookServerEndpoint &&
        this.isAllPublicIPAddressesEnabled()
      );
    }
  }
}

View File

@@ -1,4 +1,4 @@
import { Link, Text } from "@fluentui/react";
import { initializeIcons, Link, Text } from "@fluentui/react";
import "bootstrap/dist/css/bootstrap.css";
import * as React from "react";
import * as ReactDOM from "react-dom";
@@ -20,7 +20,7 @@ const createAccountUrl = "https://aka.ms/cosmos-create-account-portal";
const onInit = async () => {
  const dataExplorerUrl = new URL("./", window.location.href).href;
  // initializeIcons();
  initializeIcons();
  await initializeConfiguration();
  const galleryViewerProps = GalleryUtils.getGalleryViewerProps(window.location.search);

View File

@@ -1,4 +1,4 @@
// import { initializeIcons } from "@fluentui/react";
import { initializeIcons } from "@fluentui/react";
import { useBoolean } from "@fluentui/react-hooks";
import { AadAuthorizationFailure } from "Platform/Hosted/Components/AadAuthorizationFailure";
import * as React from "react";
@@ -22,7 +22,7 @@ import { useAADAuth } from "./hooks/useAADAuth";
import { useConfig } from "./hooks/useConfig";
import { useTokenMetadata } from "./hooks/usePortalAccessToken";
// initializeIcons();
initializeIcons();
const App: React.FunctionComponent = () => {
  // For handling encrypted portal tokens sent via query paramter

View File

@@ -2,7 +2,7 @@
import "./ReactDevTools"; import "./ReactDevTools";
// CSS Dependencies // CSS Dependencies
import { loadTheme } from "@fluentui/react"; import { initializeIcons, loadTheme } from "@fluentui/react";
import { QuickstartCarousel } from "Explorer/Quickstart/QuickstartCarousel"; import { QuickstartCarousel } from "Explorer/Quickstart/QuickstartCarousel";
import { MongoQuickstartTutorial } from "Explorer/Quickstart/Tutorials/MongoQuickstartTutorial"; import { MongoQuickstartTutorial } from "Explorer/Quickstart/Tutorials/MongoQuickstartTutorial";
import { SQLQuickstartTutorial } from "Explorer/Quickstart/Tutorials/SQLQuickstartTutorial"; import { SQLQuickstartTutorial } from "Explorer/Quickstart/Tutorials/SQLQuickstartTutorial";
@@ -62,7 +62,7 @@ import "./Shared/appInsights";
import { useConfig } from "./hooks/useConfig";
import { useKnockoutExplorer } from "./hooks/useKnockoutExplorer";
// initializeIcons();
initializeIcons();
const App: React.FunctionComponent = () => {
  const isCarouselOpen = useCarousel((state) => state.shouldOpen);

View File

@@ -1,4 +1,4 @@
// import { initializeIcons } from "@fluentui/react";
import { initializeIcons } from "@fluentui/react";
import "bootstrap/dist/css/bootstrap.css";
import React from "react";
import * as ReactDOM from "react-dom";
@@ -14,7 +14,7 @@ import { IGalleryItem, JunoClient } from "../Juno/JunoClient";
import * as GalleryUtils from "../Utils/GalleryUtils";
const onInit = async () => {
  // initializeIcons();
  initializeIcons();
  await initializeConfiguration();
  const galleryViewerProps = GalleryUtils.getGalleryViewerProps(window.location.search);
  const notebookViewerProps = GalleryUtils.getNotebookViewerProps(window.location.search);

View File

@@ -1,13 +1,11 @@
import { useBoolean } from "@fluentui/react-hooks";
import { userContext } from "UserContext";
import { useNewPortalBackendEndpoint } from "Utils/EndpointUtils";
import * as React from "react";
import ConnectImage from "../../../../images/HdeConnectCosmosDB.svg";
import ErrorImage from "../../../../images/error.svg";
import { AuthType } from "../../../AuthType";
import { BackendApi, HttpHeaders } from "../../../Common/Constants";
import { HttpHeaders } from "../../../Common/Constants";
import { configContext } from "../../../ConfigContext";
import { GenerateTokenResponse } from "../../../Contracts/DataModels";
import { isResourceTokenConnectionString } from "../Helpers/ResourceTokenUtils";
interface Props {
@@ -19,10 +17,6 @@ interface Props {
}
export const fetchEncryptedToken = async (connectionString: string): Promise<string> => {
if (!useNewPortalBackendEndpoint(BackendApi.GenerateToken)) {
return await fetchEncryptedToken_ToBeDeprecated(connectionString);
}
  const headers = new Headers();
  headers.append(HttpHeaders.connectionString, connectionString);
  const url = configContext.PORTAL_BACKEND_ENDPOINT + "/api/connectionstring/token/generatetoken";
@@ -35,28 +29,10 @@ export const fetchEncryptedToken = async (connectionString: string): Promise<str
  return decodeURIComponent(encryptedTokenResponse);
};
export const fetchEncryptedToken_ToBeDeprecated = async (connectionString: string): Promise<string> => {
const headers = new Headers();
headers.append(HttpHeaders.connectionString, connectionString);
const url = configContext.BACKEND_ENDPOINT + "/api/guest/tokens/generateToken";
const response = await fetch(url, { headers, method: "POST" });
if (!response.ok) {
throw response;
}
// This API has a quirk where it must be parsed twice
const result: GenerateTokenResponse = JSON.parse(await response.json());
return decodeURIComponent(result.readWrite || result.read);
};
export const isAccountRestrictedForConnectionStringLogin = async (connectionString: string): Promise<boolean> => {
  const headers = new Headers();
  headers.append(HttpHeaders.connectionString, connectionString);
const url = configContext.PORTAL_BACKEND_ENDPOINT + "/api/guest/accountrestrictions/checkconnectionstringlogin";
const backendEndpoint: string = useNewPortalBackendEndpoint(BackendApi.AccountRestrictions)
? configContext.PORTAL_BACKEND_ENDPOINT
: configContext.BACKEND_ENDPOINT;
const url = backendEndpoint + "/api/guest/accountrestrictions/checkconnectionstringlogin";
  const response = await fetch(url, { headers, method: "POST" });
  if (!response.ok) {
    throw response;
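A hedged usage sketch of the simplified helper (the connection string below is a placeholder, not a real credential):

// fetchEncryptedToken now always goes through the portal backend
// (configContext.PORTAL_BACKEND_ENDPOINT + "/api/connectionstring/token/generatetoken").
const connectionString = "AccountEndpoint=https://<account>.documents.azure.com:443/;AccountKey=<key>;";
const encryptedToken = await fetchEncryptedToken(connectionString);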

View File

@@ -16,6 +16,7 @@ export type Features = {
  readonly enableAadDataPlane: boolean;
  readonly enableResourceGraph: boolean;
  readonly enableKoResourceTree: boolean;
readonly enableThroughputBuckets: boolean;
  readonly hostedDataExplorer: boolean;
  readonly junoEndpoint?: string;
  readonly phoenixEndpoint?: string;
@@ -38,7 +39,7 @@ export type Features = {
  readonly copilotChatFixedMonacoEditorHeight: boolean;
  readonly enablePriorityBasedExecution: boolean;
  readonly disableConnectionStringLogin: boolean;
  readonly restoreTabs: boolean;
  readonly enableCloudShell: boolean;
  // can be set via both flight and feature flag
  autoscaleDefault: boolean;
@@ -82,6 +83,7 @@ export function extractFeatures(given = new URLSearchParams(window.location.sear
    enableSpark: "true" === get("enablespark"),
    enableTtl: "true" === get("enablettl"),
    enableKoResourceTree: "true" === get("enablekoresourcetree"),
enableThroughputBuckets: "true" === get("enablethroughputbuckets"),
    executeSproc: "true" === get("dataexplorerexecutesproc"),
    hostedDataExplorer: "true" === get("hosteddataexplorerenabled"),
    mongoProxyEndpoint: get("mongoproxyendpoint"),
@@ -109,7 +111,7 @@ export function extractFeatures(given = new URLSearchParams(window.location.sear
    copilotChatFixedMonacoEditorHeight: "true" === get("copilotchatfixedmonacoeditorheight"),
    enablePriorityBasedExecution: "true" === get("enableprioritybasedexecution"),
    disableConnectionStringLogin: "true" === get("disableconnectionstringlogin"),
    restoreTabs: "true" === get("restoretabs"),
    enableCloudShell: "true" === get("enablecloudshell"),
  };
}
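For context, a short sketch of how the new flag is consumed (illustrative only; extractFeatures reads flags from the page's query string by default):

const features = extractFeatures(new URLSearchParams(window.location.search));
if (features.enableCloudShell) {
  // TerminalTab will pick the CloudShell adapter instead of the
  // notebook-based terminal adapter (see TerminalTab.tsx above).
}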

View File

@@ -1,4 +1,4 @@
import { Spinner, SpinnerSize } from "@fluentui/react";
import { initializeIcons, Spinner, SpinnerSize } from "@fluentui/react";
import * as React from "react";
import ReactDOM from "react-dom";
import { withTranslation } from "react-i18next";
@@ -13,7 +13,7 @@ import "./SelfServe.less";
import { SelfServeComponent } from "./SelfServeComponent";
import { SelfServeDescriptor } from "./SelfServeTypes";
import { SelfServeType } from "./SelfServeUtils";
// initializeIcons();
initializeIcons();
const loadTranslationFile = async (className: string): Promise<void> => {
  const language = i18n.languages[0];

View File

@@ -46,14 +46,14 @@ export function decryptJWTToken(token: string) {
  return JSON.parse(tokenPayload);
}
export async function getMsalInstance() {
export async function getMsalInstance(clientId: string = "203f1145-856a-4232-83d4-a43568fba23d") {
  const msalConfig: msal.Configuration = {
    cache: {
      cacheLocation: "localStorage",
    },
    auth: {
      authority: `${configContext.AAD_ENDPOINT}organizations`,
      clientId: "203f1145-856a-4232-83d4-a43568fba23d",
      clientId: clientId,
    },
  };
@@ -68,7 +68,8 @@ export async function getMsalInstance() {
export async function acquireMsalTokenForAccount(
  account: DatabaseAccount,
  silent: boolean = false,
  user_hint?: string,
  clientId: string = "203f1145-856a-4232-83d4-a43568fba23d",
  user_hint?: string
) {
  if (userContext.databaseAccount.properties?.documentEndpoint === undefined) {
    throw new Error("Database account has no document endpoint defined");
@@ -77,7 +78,7 @@ export async function acquireMsalTokenForAccount(
    /\/+$/,
    "/.default",
  );
  const msalInstance = await getMsalInstance();
  const msalInstance = await getMsalInstance(clientId);
  const knownAccounts = msalInstance.getAllAccounts();
  // If user_hint is provided, we will try to use it to find the account.
  // If no account is found, we will use the current active account or first account in the list.
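A hedged sketch of how the new optional clientId parameter might be used, for example when the CloudShell flow needs a token issued against a different application registration (the GUID below is a placeholder; the default remains the Data Explorer app id baked into getMsalInstance):

const CLOUD_SHELL_CLIENT_ID = "00000000-0000-0000-0000-000000000000"; // placeholder, not a real app id
const token = await acquireMsalTokenForAccount(
  userContext.databaseAccount, // account
  true, // silent
  CLOUD_SHELL_CLIENT_ID, // clientId (new optional parameter)
);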

View File

@@ -1,11 +1,4 @@
import {
  BackendApi,
  CassandraProxyEndpoints,
  JunoEndpoints,
  MongoProxyEndpoints,
  PortalBackendEndpoints,
} from "Common/Constants";
import { configContext } from "ConfigContext";
import { CassandraProxyEndpoints, JunoEndpoints, MongoProxyEndpoints, PortalBackendEndpoints } from "Common/Constants";
import * as Logger from "../Common/Logger";
export function validateEndpoint(
@@ -73,9 +66,6 @@ export const PortalBackendIPs: { [key: string]: string[] } = {
//"https://main2.documentdb.ext.azure.com": ["104.42.196.69"], //"https://main2.documentdb.ext.azure.com": ["104.42.196.69"],
"https://main.documentdb.ext.azure.cn": ["139.217.8.252"], "https://main.documentdb.ext.azure.cn": ["139.217.8.252"],
"https://main.documentdb.ext.azure.us": ["52.244.48.71"], "https://main.documentdb.ext.azure.us": ["52.244.48.71"],
// Add ussec and usnat when endpoint address is known:
//ussec: ["29.26.26.67", "29.26.26.66"],
//usnat: ["7.28.202.68"],
};
export const PortalBackendOutboundIPs: { [key: string]: string[] } = {
@@ -100,14 +90,6 @@ export const defaultAllowedMongoProxyEndpoints: ReadonlyArray<string> = [
  MongoProxyEndpoints.Mooncake,
];
export const allowedMongoProxyEndpoints_ToBeDeprecated: ReadonlyArray<string> = [
"https://main.documentdb.ext.azure.com",
"https://main.documentdb.ext.azure.cn",
"https://main.documentdb.ext.azure.us",
"https://main.cosmos.ext.azure",
"https://localhost:12901",
];
export const defaultAllowedCassandraProxyEndpoints: ReadonlyArray<string> = [
  CassandraProxyEndpoints.Development,
  CassandraProxyEndpoints.Mpac,
@@ -141,9 +123,7 @@ export const allowedArcadiaEndpoints: ReadonlyArray<string> = ["https://workspac
export const allowedHostedExplorerEndpoints: ReadonlyArray<string> = ["https://cosmos.azure.com/"];
export const allowedMsalRedirectEndpoints: ReadonlyArray<string> = [
  "https://cosmos-explorer-preview.azurewebsites.net/",
];
export const allowedMsalRedirectEndpoints: ReadonlyArray<string> = ["https://dataexplorer-preview.azurewebsites.net/"];
export const allowedJunoOrigins: ReadonlyArray<string> = [
  JunoEndpoints.Test,
@@ -155,53 +135,3 @@ export const allowedJunoOrigins: ReadonlyArray<string> = [
];
export const allowedNotebookServerUrls: ReadonlyArray<string> = [];
//
// Temporary function to determine if a portal backend API is supported by the
// new backend in this environment.
//
// TODO: Remove this function once new backend migration is completed for all environments.
//
export function useNewPortalBackendEndpoint(backendApi: string): boolean {
// This maps backend APIs to the environments supported by the new backend.
const newBackendApiEnvironmentMap: { [key: string]: string[] } = {
[BackendApi.GenerateToken]: [
PortalBackendEndpoints.Development,
PortalBackendEndpoints.Mpac,
PortalBackendEndpoints.Prod,
],
[BackendApi.PortalSettings]: [
PortalBackendEndpoints.Development,
PortalBackendEndpoints.Mpac,
PortalBackendEndpoints.Prod,
],
[BackendApi.AccountRestrictions]: [
PortalBackendEndpoints.Development,
PortalBackendEndpoints.Mpac,
PortalBackendEndpoints.Prod,
],
[BackendApi.RuntimeProxy]: [
PortalBackendEndpoints.Development,
PortalBackendEndpoints.Mpac,
PortalBackendEndpoints.Prod,
],
[BackendApi.DisallowedLocations]: [
PortalBackendEndpoints.Development,
PortalBackendEndpoints.Mpac,
PortalBackendEndpoints.Prod,
PortalBackendEndpoints.Fairfax,
PortalBackendEndpoints.Mooncake,
],
[BackendApi.SampleData]: [
PortalBackendEndpoints.Development,
PortalBackendEndpoints.Mpac,
PortalBackendEndpoints.Prod,
],
};
if (!newBackendApiEnvironmentMap[backendApi] || !configContext.PORTAL_BACKEND_ENDPOINT) {
return false;
}
return newBackendApiEnvironmentMap[backendApi].includes(configContext.PORTAL_BACKEND_ENDPOINT);
}

View File

@@ -3,13 +3,13 @@ (the same hunk repeats in each of the regenerated ARM clients below)
  Run "npm run generateARMClients" to regenerate
  Edting this file directly should be done with extreme caution as not to diverge from ARM REST specs
  Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
  Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";

The identical change (the "Generated from" spec URL and the pinned apiVersion bumped from 2024-02-15-preview to 2024-12-01-preview) is applied to every regenerated client in this diff. Their leading operations are:
  listCassandraKeyspaces  /* Lists the Cassandra keyspaces under an existing Azure Cosmos DB database account. */
  listMetrics  /* Retrieves the metrics determined by the given filter for the given database account and collection. */
  listMetrics  /* Retrieves the metrics determined by the given filter for the given collection, split by partition. */
  listMetrics  /* Retrieves the metrics determined by the given filter for the given collection and region, split by partition. */
  listMetrics  /* Retrieves the metrics determined by the given filter for the given database account, collection and region. */
  listMetrics  /* Retrieves the metrics determined by the given filter for the given database account and database. */
  listMetrics  /* Retrieves the metrics determined by the given filter for the given database account and region. */
  get  /* Retrieves the properties of an existing Azure Cosmos DB database account. */
  listGraphs  /* Lists the graphs under an existing Azure Cosmos DB database account. */
  listGremlinDatabases  /* Lists the Gremlin databases under an existing Azure Cosmos DB database account. */
  list  /* List Cosmos DB locations and their properties */
  listMongoDBDatabases  /* Lists the MongoDB databases under an existing Azure Cosmos DB database account. */
  list  /* Lists all of the available Cosmos DB Resource Provider operations. */
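The shape of the calls these generated clients make is unchanged; only the pinned apiVersion moves. A simplified sketch (the function and path below are illustrative, not taken from the diff):

import { configContext } from "ConfigContext";
import { armRequest } from "Utils/arm/request";

const apiVersion = "2024-12-01-preview";

// Hypothetical example: fetch a database account with the new API version.
async function getDatabaseAccount(accountArmId: string): Promise<unknown> {
  return await armRequest({
    host: configContext.ARM_ENDPOINT,
    path: accountArmId,
    method: "GET",
    apiVersion,
  });
}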

Some files were not shown because too many files have changed in this diff.