Compare commits

...

31 Commits

Author SHA1 Message Date
jawelton74
a04eaff6be Add Tables to missing api type checks for dataplane RBAC. (#2060)
* Add Tables to missing api type checks for dataplane RBAC.

* Comment out test that is broken due to invalid hook call error.
2025-02-20 08:15:53 -08:00
jawelton74
51a412e2c0 Change value of the example SelfServeType enum to match name of localization file. (#2062)
2025-02-20 07:06:25 -08:00
SATYA SB
3fcbdf6152 [accessibility-3739790-3739677]:[Forms and Validation - Azure Cosmos DB- Data Explorer - New Vertex]: Visual Label is not defined for Key, Value and Type input fields under 'New Vertex' pane. (#2040)
Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-02-19 11:26:15 +05:30
SATYA SB
8da078579e [accessibility-3739618]:[Screen Reader - Azure Cosmos DB- Data Explorer - Graphs]: Screen Reader announces both expanded and collapsed information simultaneously for expand/collapse button in bottom notification region under 'Data Explorer' pane. (#2048)
Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-02-19 11:25:44 +05:30
vchske
4ac41031e6 Fixing SelfServeType enum to work in MPAC (#2057) 2025-02-18 09:59:51 -08:00
jawelton74
d7923db108 Add Tables as an API type that supports dataplane RBAC. (#2056) 2025-02-18 09:29:53 -08:00
SATYA SB
0170c9e1cc [accessibility-3739182]:[Visual Requirement - Azure Cosmos DB - Add Row]: Ensures the contrast between foreground and background colors meets WCAG 2 AA minimum contrast ratio thresholds. (#2054)
Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-02-14 11:53:01 +05:30
bogercraig
2730da7ab6 Backend Migration - Remove Use of Legacy Backend from DE (#2043)
* Default to new backend endpoint if the endpoint in current context does not match existing set in constants.

* Remove some env references.

* Added comments with reasoning for selecting new backend by default.

* Update comment.

* Remove all references to useNewPortalBackendEndpoint now that old backend is disabled in all environments.

* Resolve lint issues.

* Removed references to old backend from Cassandra and Mongo Apis

* fix unit tests

---------

Co-authored-by: Asier Isayas <aisayas@microsoft.com>
2025-02-12 18:12:59 -08:00
sunghyunkang1111
de2449ee25 Adding throughput bucket settings in Data Explorer (#2044)
* Added throughput bucketing

* fix bugs

* enable/disable per autoscale selection

* Added logic

* change query bucket to group

* Updated to a tab

* Fixed unit tests

* Edit package-lock

* Compile build fix

* fix unit tests

* moving the throughput bucket flag to the client generation level
2025-02-12 13:10:07 -06:00
sunghyunkang1111
99378582ce Remove blocking await on sample database (#2047)
* Remove blocking await on sample database

* Remove compress flag to reduce bundle size

* Fix typo in webpack config comment date
2025-02-12 13:09:52 -06:00
SATYA SB
bd592d07af [accessibility-1217621]: Keyboard focus gets lost on the page which opens after activating "Data Explorer" menu item present under 'Overview' page. (#1927)
Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-02-12 11:31:30 +05:30
asier-isayas
644f5941ec Set default throughput based on account's workload type (#2021)
* assign default throughput based on workload type

* combined common logic

* fix unit tests

* add tests

* update tests

* npm run format

* Update ci.yml

---------

Co-authored-by: Asier Isayas <aisayas@microsoft.com>
2025-02-11 17:47:55 -05:00
jawelton74
9fb006a996 Restore DisplayNPSSurvey message type enum which was removed in a prior change. (#2046)
2025-02-11 06:58:44 -08:00
jawelton74
c2b98c3e23 Modify E2E cleanup script to use @azure/identity for AZ credentials. (#2051) 2025-02-10 08:48:26 -08:00
Nishtha Ahuja
76d49d86d4 Added emulator checks in settings pane fields (#2041)
* added emulator checks

* created macro

* conditions as const

---------

Co-authored-by: Nishtha Ahuja <nishthaahuja@microsoft.com>
2025-02-10 11:52:56 +05:30
Laurent Nguyen
7893b89bf7 Do not open first container if a tab is already open (#2045)
Co-authored-by: Laurent Nguyen <languye@microsoft.com>
2025-02-06 21:58:38 +01:00
JustinKol
5945e3cb6b Removed NPS Survey from DE since it has been moved to the Overview Blade (#2027)
* Removed NPS Survey from DE since it has been moved to the Overview Blade

* Added ExplorerBindings back

* Moved applyExplorerBindings back to original place
2025-02-05 13:30:03 -05:00
Laurent Nguyen
213d1c68fe Remove feature switch on restore tabs (#2039) 2025-02-03 17:59:00 +01:00
Nishtha Ahuja
c26f9a1ebb disabled change button for emulator (#2017)
Co-authored-by: Nishtha Ahuja <nishthaahuja@microsoft.com>
2025-02-03 12:39:01 +05:30
SATYA SB
bd7cd7ae8f [accessibility-3556793]: [Screen Reader- Azure Cosmos DB- Data Explorer]: The Learn more links are not descriptive present under the settings. (#2035)
Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-01-31 10:58:44 +05:30
SATYA SB
6504358580 [Programmatic Access - Azure Cosmos DB- Data Explorer]: Keyboard focus indicator is not visible on controls inside the settings. (#2016)
* [accessibility-3556824] : [Programmatic Access - Azure Cosmos DB- Data Explorer]: Keyboard focus indicator is not visible on controls inside the settings.

* Snapshots updated.

---------

Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-01-31 10:53:18 +05:30
SATYA SB
ce88659fca [Keyboard Navigation - Cosmos DB Query Copilot - Query Faster with Copilot>Enable Query Advisor]: Keyboard focus order is not logical after selecting the 'Copy code' button. (#2010)
* [accessibility-3560073]: [Keyboard Navigation - Cosmos DB Query Copilot - Query Faster with Copilot>Enable Query Advisor]: Keyboard focus order is not logical after selecting the 'Copy code' button.

* [Keyboard Navigation - Cosmos DB Query Copilot - Query Faster with Copilot>Enable Query Advisor]: Keyboard focus order is not logical after selecting the 'Copy code' button.

---------

Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-01-31 10:48:34 +05:30
SATYA SB
642c708e9c [accessibility-3556756]: [Programmatic Access- Azure Cosmos DB- Data explorer]: Ensures <img> elements have alternate text or a role of none or presentation. (#2007)
Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-01-31 10:45:49 +05:30
SATYA SB
4156009d09 [Screen reader - Cosmos DB Query Copilot - Query Faster with Copilot>Enable Query Advisor]: Screen reader does not announce status information which appears on invoking the 'Send' button. (#2002)
* [accessibility-3549715]: [Screen reader - Cosmos DB Query Copilot - Query Faster with Copilot>Enable Query Advisor]: Screen reader does not announce status information which appears on invoking the 'Send' button.

* [accessibility-3549715]:[Screen reader - Cosmos DB Query Copilot - Query Faster with Copilot>Enable Query Advisor]: Screen reader does not announce status information which appears on invoking the 'Send' button.

---------

Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-01-31 10:44:32 +05:30
SATYA SB
5c6abbd635 [accessibility-3556595]: [Programmatic Access- Azure Cosmos DB- Data Explorer]: Ensures role attribute has an appropriate value for the element. (#2001)
Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-01-31 10:37:11 +05:30
jawelton74
881726e9af New preview site (#2036)
* Changes to DE preview site to support managed identity. Changes to infrastructure to use new preview site.

* Fix formatting.

* Potential fix for code scanning alert no. 56: Server-side request forgery

Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>

* Use different secrets for subscription/tenant/client id's.

* Revert new id names.

* Update Az CLI config.

* Update to Node 18 and update security vulnerable dependencies.

---------

Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
2025-01-30 16:14:03 -08:00
jawelton74
7015590d1a Remove hard coded client and subscription Ids from webpack config. (#2033) 2025-01-24 07:23:33 -08:00
jawelton74
1d952a4ea2 Remove throughput survey text and link from Throughput tab. (#2031) 2025-01-21 10:53:47 -08:00
jawelton74
2a81551a60 Use unique names in upload artifacts tasks (#2030)
* Specify actual package names in upload artifacts task.

* Revert path change, use unique names for upload task.

* Fix the right properties.

* Revert condition change
2025-01-21 07:07:53 -08:00
jawelton74
eceee36913 Use azure identity package for e2e test credentials (#2032)
* Update identity package, remove ms-rest-nodeauth package.

* Test changes to use identity package.
2025-01-21 07:07:18 -08:00
jawelton74
96faf92c12 Use dotnet CLI for nuget operations in CI pipeline (#2026)
* Start of moving nuget actions to use dotnet.

* Comment out env section

* Set auth token.

* Disable globalization support.

* Comment out dotnet setup.

* Copy proj file with build.

* Place Content item under ItemGroup.

* Update project with Sdk and No Build args.

* Remove no-build from cmd line.

* Set TargetFramework version.

* Fix TargetFramework value.

* Add nuget push command.

* Fix test version string

* Add nuget add source step.

* Fix add source args.

* Enable cleartext password, remove source after completion.

* Use wildcard for nupkg path. Add debug.

* Remove debug.

* Fix nupkg path

* Fix API key argument

* Re-enable MPAC nuget. Tidy up ci.yml.

* Fix formatting of webpack config.

* Remove Globalization flag.

* Revert test changes.
2025-01-15 11:37:30 -08:00
94 changed files with 2069 additions and 2239 deletions

View File

@@ -1 +1 @@
[Preview this branch](https://cosmos-explorer-preview.azurewebsites.net/pull/EDIT_THIS_NUMBER_IN_THE_PR_DESCRIPTION?feature.someFeatureFlagYouMightNeed=true)
[Preview this branch](https://dataexplorer-preview.azurewebsites.net/pull/EDIT_THIS_NUMBER_IN_THE_PR_DESCRIPTION?feature.someFeatureFlagYouMightNeed=true)

View File

@@ -83,7 +83,7 @@ jobs:
- run: npm ci
- run: npm run build:contracts
- name: Restore Build Cache
uses: actions/cache@v2
uses: actions/cache@v4
with:
path: .cache
key: ${{ runner.os }}-build-cache
@@ -96,14 +96,16 @@ jobs:
with:
name: dist
path: dist/
- name: "Az CLI login"
uses: azure/login@v1
with:
client-id: ${{ secrets.AZURE_CLIENT_ID }}
tenant-id: ${{ secrets.AZURE_TENANT_ID }}
subscription-id: ${{ secrets.PREVIEW_SUBSCRIPTION_ID }}
- name: Upload build to preview blob storage
run: az storage blob upload-batch -d '$web' -s 'dist' --account-name cosmosexplorerpreview --destination-path "${{github.event.pull_request.head.sha || github.sha}}" --account-key="${PREVIEW_STORAGE_KEY}" --overwrite true
env:
PREVIEW_STORAGE_KEY: ${{ secrets.PREVIEW_STORAGE_KEY }}
run: az storage blob upload-batch -d '$web' -s 'dist' --account-name ${{ secrets.PREVIEW_STORAGE_ACCOUNT_NAME }} --destination-path "${{github.event.pull_request.head.sha || github.sha}}" --auth-mode login --overwrite true
- name: Upload preview config to blob storage
run: az storage blob upload -c '$web' -f ./preview/config.json --account-name cosmosexplorerpreview --name "${{github.event.pull_request.head.sha || github.sha}}/config.json" --account-key="${PREVIEW_STORAGE_KEY}" --overwrite true
env:
PREVIEW_STORAGE_KEY: ${{ secrets.PREVIEW_STORAGE_KEY }}
run: az storage blob upload -c '$web' -f ./preview/config.json --account-name ${{ secrets.PREVIEW_STORAGE_ACCOUNT_NAME }} --name "${{github.event.pull_request.head.sha || github.sha}}/config.json" --auth-mode login --overwrite true
nuget:
name: Publish Nuget
if: github.ref == 'refs/heads/master' || contains(github.ref, 'hotfix/') || contains(github.ref, 'release/')
@@ -113,21 +115,21 @@ jobs:
NUGET_SOURCE: ${{ secrets.NUGET_SOURCE }}
AZURE_DEVOPS_PAT: ${{ secrets.AZURE_DEVOPS_PAT }}
steps:
- uses: nuget/setup-nuget@v2
with:
nuget-api-key: ${{ secrets.NUGET_API_KEY }}
- name: Download Dist Folder
uses: actions/download-artifact@v4
with:
name: dist
- run: cp ./configs/prod.json config.json
- run: nuget sources add -Name "ADO" -Source "$NUGET_SOURCE" -UserName "jawelton@microsoft.com" -Password "$AZURE_DEVOPS_PAT"
- run: nuget pack -Version "2.0.0-github-${GITHUB_SHA}"
- run: nuget push -SkipDuplicate -Source "$NUGET_SOURCE" -ApiKey Az *.nupkg
- run: dotnet nuget add source "$NUGET_SOURCE" --name "ADO" --username "jawelton@microsoft.com" --password "$AZURE_DEVOPS_PAT" --store-password-in-clear-text
- run: dotnet pack DataExplorer.proj /p:PackageVersion="2.0.0-github-${GITHUB_SHA}"
- run: dotnet nuget push "bin/Release/*.nupkg" --skip-duplicate --api-key Az --source="$NUGET_SOURCE"
- run: dotnet nuget remove source "ADO"
- uses: actions/upload-artifact@v4
name: packages
name: Upload package to Artifacts
with:
path: "*.nupkg"
name: prod-package
path: "bin/Release/*.nupkg"
nugetmpac:
name: Publish Nuget MPAC
if: github.ref == 'refs/heads/master' || contains(github.ref, 'hotfix/') || contains(github.ref, 'release/')
@@ -137,22 +139,21 @@ jobs:
NUGET_SOURCE: ${{ secrets.NUGET_SOURCE }}
AZURE_DEVOPS_PAT: ${{ secrets.AZURE_DEVOPS_PAT }}
steps:
- uses: nuget/setup-nuget@v2
with:
nuget-api-key: ${{ secrets.NUGET_API_KEY }}
- name: Download Dist Folder
uses: actions/download-artifact@v4
with:
name: dist
- run: cp ./configs/mpac.json config.json
- run: sed -i 's/Azure.Cosmos.DB.Data.Explorer/Azure.Cosmos.DB.Data.Explorer.MPAC/g' DataExplorer.nuspec
- run: nuget sources add -Name "ADO" -Source "$NUGET_SOURCE" -UserName "jawelton@microsoft.com" -Password "$AZURE_DEVOPS_PAT"
- run: nuget pack -Version "2.0.0-github-${GITHUB_SHA}"
- run: nuget push -SkipDuplicate -Source "$NUGET_SOURCE" -ApiKey Az *.nupkg
- run: dotnet nuget add source "$NUGET_SOURCE" --name "ADO" --username "jawelton@microsoft.com" --password "$AZURE_DEVOPS_PAT" --store-password-in-clear-text
- run: dotnet pack DataExplorer.proj /p:PackageVersion="2.0.0-github-${GITHUB_SHA}"
- run: dotnet nuget push "bin/Release/*.nupkg" --skip-duplicate --api-key Az --source="$NUGET_SOURCE"
- run: dotnet nuget remove source "ADO"
- uses: actions/upload-artifact@v4
name: packages
name: Upload package to Artifacts
with:
path: "*.nupkg"
name: mpac-package
path: "bin/Release/*.nupkg"
playwright-tests:
name: "Run Playwright Tests (Shard ${{ matrix.shardIndex }} of ${{ matrix.shardTotal }})"

DataExplorer.proj (new file, 9 lines added)
View File

@@ -0,0 +1,9 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net8.0</TargetFramework>
<NoBuild>true</NoBuild>
<IncludeBuildOutput>false</IncludeBuildOutput>
<NuspecFile>DataExplorer.nuspec</NuspecFile>
<NuspecProperties>version=$(PackageVersion)</NuspecProperties>
</PropertyGroup>
</Project>

View File

@@ -61,6 +61,8 @@
@GalleryBackgroundColor: #fdfdfd;
@LinkColor: #2d6da4;
//Icons
@InfoIconColor: #0072c6;
@WarningIconColor: #db7500;

View File

@@ -1830,6 +1830,14 @@ input::-webkit-calendar-picker-indicator::after {
transform: rotate(90deg);
}
.customAccordion button:focus {
.focus();
}
.customAccordion {
margin-top: 1px;
}
.datalist-arrow:after:hover {
content: "\276F";
position: absolute;

package-lock.json (generated; 516 lines changed)
View File

@@ -12,8 +12,7 @@
"@azure/arm-cosmosdb": "9.1.0",
"@azure/cosmos": "4.2.0-beta.1",
"@azure/cosmos-language-service": "0.0.5",
"@azure/identity": "1.5.2",
"@azure/ms-rest-nodeauth": "3.1.1",
"@azure/identity": "4.5.0",
"@azure/msal-browser": "2.14.2",
"@babel/plugin-proposal-class-properties": "7.12.1",
"@babel/plugin-proposal-decorators": "7.12.12",
@@ -428,47 +427,68 @@
"license": "0BSD"
},
"node_modules/@azure/identity": {
"version": "1.5.2",
"license": "MIT",
"version": "4.5.0",
"resolved": "https://registry.npmjs.org/@azure/identity/-/identity-4.5.0.tgz",
"integrity": "sha512-EknvVmtBuSIic47xkOqyNabAme0RYTw52BTMz8eBgU1ysTyMrD1uOoM+JdS0J/4Yfp98IBT3osqq3BfwSaNaGQ==",
"dependencies": {
"@azure/core-auth": "^1.3.0",
"@azure/core-client": "^1.0.0",
"@azure/core-rest-pipeline": "^1.1.0",
"@azure/core-tracing": "1.0.0-preview.12",
"@azure/abort-controller": "^2.0.0",
"@azure/core-auth": "^1.9.0",
"@azure/core-client": "^1.9.2",
"@azure/core-rest-pipeline": "^1.17.0",
"@azure/core-tracing": "^1.0.0",
"@azure/core-util": "^1.11.0",
"@azure/logger": "^1.0.0",
"@azure/msal-node": "1.0.0-beta.6",
"@types/stoppable": "^1.1.0",
"axios": "^0.21.1",
"@azure/msal-browser": "^3.26.1",
"@azure/msal-node": "^2.15.0",
"events": "^3.0.0",
"jws": "^4.0.0",
"msal": "^1.0.2",
"open": "^7.0.0",
"qs": "^6.7.0",
"open": "^8.0.0",
"stoppable": "^1.1.0",
"tslib": "^2.0.0",
"uuid": "^8.3.0"
},
"engines": {
"node": ">=12.0.0"
},
"optionalDependencies": {
"keytar": "^7.3.0"
}
},
"node_modules/@azure/identity/node_modules/@azure/core-tracing": {
"version": "1.0.0-preview.12",
"license": "MIT",
"dependencies": {
"@opentelemetry/api": "^1.0.0",
"tslib": "^2.2.0"
},
"engines": {
"node": ">=12.0.0"
"node": ">=18.0.0"
}
},
"node_modules/@azure/identity/node_modules/@azure/msal-browser": {
"version": "3.28.1",
"resolved": "https://registry.npmjs.org/@azure/msal-browser/-/msal-browser-3.28.1.tgz",
"integrity": "sha512-OHHEWMB5+Zrix8yKvLVzU3rKDFvh7SOzAzXfICD7YgUXLxfHpTPX2pzOotrri1kskwhHqIj4a5LvhZlIqE7C7g==",
"dependencies": {
"@azure/msal-common": "14.16.0"
},
"engines": {
"node": ">=0.8.0"
}
},
"node_modules/@azure/identity/node_modules/@azure/msal-common": {
"version": "14.16.0",
"resolved": "https://registry.npmjs.org/@azure/msal-common/-/msal-common-14.16.0.tgz",
"integrity": "sha512-1KOZj9IpcDSwpNiQNjt0jDYZpQvNZay7QAEi/5DLubay40iGYtLzya/jbjRPLyOTZhEKyL1MzPuw2HqBCjceYA==",
"engines": {
"node": ">=0.8.0"
}
},
"node_modules/@azure/identity/node_modules/open": {
"version": "8.4.2",
"resolved": "https://registry.npmjs.org/open/-/open-8.4.2.tgz",
"integrity": "sha512-7x81NCL719oNbsq/3mh+hVrAWmFuEYUqrq/Iw3kUzH8ReypT9QQ0BLoJS7/G9k6N81XjW4qHWtjWwe/9eLy1EQ==",
"dependencies": {
"define-lazy-prop": "^2.0.0",
"is-docker": "^2.1.1",
"is-wsl": "^2.2.0"
},
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/@azure/identity/node_modules/tslib": {
"version": "2.6.2",
"license": "0BSD"
"version": "2.8.1",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.8.1.tgz",
"integrity": "sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w=="
},
"node_modules/@azure/logger": {
"version": "1.0.4",
@@ -484,10 +504,6 @@
"version": "2.6.2",
"license": "0BSD"
},
"node_modules/@azure/ms-rest-azure-env": {
"version": "2.0.0",
"license": "MIT"
},
"node_modules/@azure/ms-rest-azure-js": {
"version": "2.1.0",
"license": "MIT",
@@ -559,15 +575,6 @@
"node": ">=4.0"
}
},
"node_modules/@azure/ms-rest-nodeauth": {
"version": "3.1.1",
"license": "MIT",
"dependencies": {
"@azure/ms-rest-azure-env": "^2.0.0",
"@azure/ms-rest-js": "^2.0.4",
"adal-node": "^0.2.2"
}
},
"node_modules/@azure/msal-browser": {
"version": "2.14.2",
"license": "MIT",
@@ -589,13 +596,24 @@
}
},
"node_modules/@azure/msal-node": {
"version": "1.0.0-beta.6",
"license": "MIT",
"version": "2.16.2",
"resolved": "https://registry.npmjs.org/@azure/msal-node/-/msal-node-2.16.2.tgz",
"integrity": "sha512-An7l1hEr0w1HMMh1LU+rtDtqL7/jw74ORlc9Wnh06v7TU/xpG39/Zdr1ZJu3QpjUfKJ+E0/OXMW8DRSWTlh7qQ==",
"dependencies": {
"@azure/msal-common": "^4.0.0",
"axios": "^0.21.1",
"jsonwebtoken": "^8.5.1",
"@azure/msal-common": "14.16.0",
"jsonwebtoken": "^9.0.0",
"uuid": "^8.3.0"
},
"engines": {
"node": ">=16"
}
},
"node_modules/@azure/msal-node/node_modules/@azure/msal-common": {
"version": "14.16.0",
"resolved": "https://registry.npmjs.org/@azure/msal-common/-/msal-common-14.16.0.tgz",
"integrity": "sha512-1KOZj9IpcDSwpNiQNjt0jDYZpQvNZay7QAEi/5DLubay40iGYtLzya/jbjRPLyOTZhEKyL1MzPuw2HqBCjceYA==",
"engines": {
"node": ">=0.8.0"
}
},
"node_modules/@babel/code-frame": {
@@ -10073,13 +10091,6 @@
"@octokit/openapi-types": "^19.0.2"
}
},
"node_modules/@opentelemetry/api": {
"version": "1.8.0",
"license": "Apache-2.0",
"engines": {
"node": ">=8.0.0"
}
},
"node_modules/@phosphor/algorithm": {
"version": "1.2.0",
"license": "BSD-3-Clause"
@@ -12725,13 +12736,6 @@
"devOptional": true,
"license": "MIT"
},
"node_modules/@types/stoppable": {
"version": "1.1.3",
"license": "MIT",
"dependencies": {
"@types/node": "*"
}
},
"node_modules/@types/styled-components": {
"version": "5.1.1",
"dev": true,
@@ -13334,61 +13338,6 @@
"node": ">=0.4.0"
}
},
"node_modules/adal-node": {
"version": "0.2.4",
"license": "Apache-2.0",
"dependencies": {
"@xmldom/xmldom": "^0.8.3",
"async": "^2.6.3",
"axios": "^0.21.1",
"date-utils": "*",
"jws": "3.x.x",
"underscore": ">= 1.3.1",
"uuid": "^3.1.0",
"xpath.js": "~1.1.0"
},
"engines": {
"node": ">= 0.6.15"
}
},
"node_modules/adal-node/node_modules/@xmldom/xmldom": {
"version": "0.8.10",
"license": "MIT",
"engines": {
"node": ">=10.0.0"
}
},
"node_modules/adal-node/node_modules/async": {
"version": "2.6.4",
"license": "MIT",
"dependencies": {
"lodash": "^4.17.14"
}
},
"node_modules/adal-node/node_modules/jwa": {
"version": "1.4.1",
"license": "MIT",
"dependencies": {
"buffer-equal-constant-time": "1.0.1",
"ecdsa-sig-formatter": "1.0.11",
"safe-buffer": "^5.0.1"
}
},
"node_modules/adal-node/node_modules/jws": {
"version": "3.2.2",
"license": "MIT",
"dependencies": {
"jwa": "^1.4.1",
"safe-buffer": "^5.0.1"
}
},
"node_modules/adal-node/node_modules/uuid": {
"version": "3.4.0",
"license": "MIT",
"bin": {
"uuid": "bin/uuid"
}
},
"node_modules/address": {
"version": "1.1.2",
"dev": true,
@@ -14021,13 +13970,6 @@
"version": "1.12.0",
"license": "MIT"
},
"node_modules/axios": {
"version": "0.21.4",
"license": "MIT",
"dependencies": {
"follow-redirects": "^1.14.0"
}
},
"node_modules/babel-core": {
"version": "7.0.0-bridge.0",
"dev": true,
@@ -14731,7 +14673,7 @@
},
"node_modules/base64-js": {
"version": "1.5.1",
"devOptional": true,
"dev": true,
"funding": [
{
"type": "github",
@@ -14803,8 +14745,9 @@
},
"node_modules/bl": {
"version": "4.1.0",
"devOptional": true,
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"buffer": "^5.5.0",
"inherits": "^2.0.4",
@@ -14813,7 +14756,7 @@
},
"node_modules/bl/node_modules/buffer": {
"version": "5.7.1",
"devOptional": true,
"dev": true,
"funding": [
{
"type": "github",
@@ -14829,6 +14772,7 @@
}
],
"license": "MIT",
"peer": true,
"dependencies": {
"base64-js": "^1.3.1",
"ieee754": "^1.1.13"
@@ -14836,8 +14780,9 @@
},
"node_modules/bl/node_modules/readable-stream": {
"version": "3.6.2",
"devOptional": true,
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"inherits": "^2.0.3",
"string_decoder": "^1.1.1",
@@ -15407,11 +15352,6 @@
"node": ">=8.0"
}
},
"node_modules/chownr": {
"version": "1.1.4",
"license": "ISC",
"optional": true
},
"node_modules/chrome-trace-event": {
"version": "1.0.3",
"license": "MIT",
@@ -17175,13 +17115,6 @@
"version": "1.29.0",
"license": "MIT"
},
"node_modules/date-utils": {
"version": "1.2.21",
"license": "MIT",
"engines": {
"node": ">0.4.0"
}
},
"node_modules/dayjs": {
"version": "1.8.19",
"license": "MIT"
@@ -17276,14 +17209,6 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/deep-extend": {
"version": "0.6.0",
"license": "MIT",
"optional": true,
"engines": {
"node": ">=4.0.0"
}
},
"node_modules/deep-is": {
"version": "0.1.4",
"license": "MIT"
@@ -17344,7 +17269,6 @@
},
"node_modules/define-lazy-prop": {
"version": "2.0.0",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=8"
@@ -18976,14 +18900,6 @@
"version": "2.0.0",
"license": "MIT"
},
"node_modules/expand-template": {
"version": "2.0.3",
"license": "(MIT OR WTFPL)",
"optional": true,
"engines": {
"node": ">=6"
}
},
"node_modules/expect": {
"version": "29.7.0",
"resolved": "https://registry.npmjs.org/expect/-/expect-29.7.0.tgz",
@@ -19671,6 +19587,7 @@
},
"node_modules/follow-redirects": {
"version": "1.15.3",
"dev": true,
"funding": [
{
"type": "individual",
@@ -19978,11 +19895,6 @@
"node": ">= 0.6"
}
},
"node_modules/fs-constants": {
"version": "1.0.0",
"license": "MIT",
"optional": true
},
"node_modules/fs-extra": {
"version": "7.0.0",
"dev": true,
@@ -20197,11 +20109,6 @@
"assert-plus": "^1.0.0"
}
},
"node_modules/github-from-package": {
"version": "0.0.0",
"license": "MIT",
"optional": true
},
"node_modules/glob": {
"version": "7.2.3",
"license": "ISC",
@@ -21333,7 +21240,7 @@
},
"node_modules/ieee754": {
"version": "1.2.1",
"devOptional": true,
"dev": true,
"funding": [
{
"type": "github",
@@ -21522,7 +21429,7 @@
},
"node_modules/ini": {
"version": "1.3.8",
"devOptional": true,
"dev": true,
"license": "ISC"
},
"node_modules/internal-slot": {
@@ -27558,8 +27465,9 @@
}
},
"node_modules/jsonwebtoken": {
"version": "8.5.1",
"license": "MIT",
"version": "9.0.2",
"resolved": "https://registry.npmjs.org/jsonwebtoken/-/jsonwebtoken-9.0.2.tgz",
"integrity": "sha512-PRp66vJ865SSqOlgqS8hujT5U4AOgMfhrwYIuIhfKaoSCZcirrmASQr8CX7cUg+RMih+hgznrjp99o+W4pJLHQ==",
"dependencies": {
"jws": "^3.2.2",
"lodash.includes": "^4.3.0",
@@ -27570,16 +27478,17 @@
"lodash.isstring": "^4.0.1",
"lodash.once": "^4.0.0",
"ms": "^2.1.1",
"semver": "^5.6.0"
"semver": "^7.5.4"
},
"engines": {
"node": ">=4",
"npm": ">=1.4.28"
"node": ">=12",
"npm": ">=6"
}
},
"node_modules/jsonwebtoken/node_modules/jwa": {
"version": "1.4.1",
"license": "MIT",
"resolved": "https://registry.npmjs.org/jwa/-/jwa-1.4.1.tgz",
"integrity": "sha512-qiLX/xhEEFKUAJ6FiBMbes3w9ATzyk5W7Hvzpa/SLYdxNtng+gcurvrI7TbACjIXlsJyr05/S1oUhZrc63evQA==",
"dependencies": {
"buffer-equal-constant-time": "1.0.1",
"ecdsa-sig-formatter": "1.0.11",
@@ -27588,12 +27497,24 @@
},
"node_modules/jsonwebtoken/node_modules/jws": {
"version": "3.2.2",
"license": "MIT",
"resolved": "https://registry.npmjs.org/jws/-/jws-3.2.2.tgz",
"integrity": "sha512-YHlZCB6lMTllWDtSPHz/ZXTsi8S00usEV6v1tjq8tOUZzw7DpSDWVXjXDre6ed1w/pd495ODpHZYSdkRTsa0HA==",
"dependencies": {
"jwa": "^1.4.1",
"safe-buffer": "^5.0.1"
}
},
"node_modules/jsonwebtoken/node_modules/semver": {
"version": "7.6.3",
"resolved": "https://registry.npmjs.org/semver/-/semver-7.6.3.tgz",
"integrity": "sha512-oVekP1cKtI+CTDvHWYFUcMtsK/00wmAEfyqKfNdARm8u1wNVhSgaX7A8d4UuIlUI5e84iEwOhs7ZPYRmzU9U6A==",
"bin": {
"semver": "bin/semver.js"
},
"engines": {
"node": ">=10"
}
},
"node_modules/jsprim": {
"version": "1.4.2",
"license": "MIT",
@@ -27646,16 +27567,6 @@
"version": "2.6.0",
"license": "MIT"
},
"node_modules/keytar": {
"version": "7.9.0",
"hasInstallScript": true,
"license": "MIT",
"optional": true,
"dependencies": {
"node-addon-api": "^4.3.0",
"prebuild-install": "^7.0.1"
}
},
"node_modules/keyv": {
"version": "4.5.4",
"license": "MIT",
@@ -27972,7 +27883,8 @@
},
"node_modules/lodash.includes": {
"version": "4.3.0",
"license": "MIT"
"resolved": "https://registry.npmjs.org/lodash.includes/-/lodash.includes-4.3.0.tgz",
"integrity": "sha512-W3Bx6mdkRTGtlJISOvVD/lbqjTlPPUDTMnlXZFnVwi9NKJ6tiAk6LVdlhZMm17VZisqhKcgzpO5Wz91PCt5b0w=="
},
"node_modules/lodash.invokemap": {
"version": "4.6.0",
@@ -27981,7 +27893,8 @@
},
"node_modules/lodash.isboolean": {
"version": "3.0.3",
"license": "MIT"
"resolved": "https://registry.npmjs.org/lodash.isboolean/-/lodash.isboolean-3.0.3.tgz",
"integrity": "sha512-Bz5mupy2SVbPHURB98VAcw+aHh4vRV5IPNhILUCsOzRmsTmSQ17jIuqopAentWoehktxGd9e/hbIXq980/1QJg=="
},
"node_modules/lodash.isequal": {
"version": "4.5.0",
@@ -27989,19 +27902,23 @@
},
"node_modules/lodash.isinteger": {
"version": "4.0.4",
"license": "MIT"
"resolved": "https://registry.npmjs.org/lodash.isinteger/-/lodash.isinteger-4.0.4.tgz",
"integrity": "sha512-DBwtEWN2caHQ9/imiNeEA5ys1JoRtRfY3d7V9wkqtbycnAmTvRRmbHKDV4a0EYc678/dia0jrte4tjYwVBaZUA=="
},
"node_modules/lodash.isnumber": {
"version": "3.0.3",
"license": "MIT"
"resolved": "https://registry.npmjs.org/lodash.isnumber/-/lodash.isnumber-3.0.3.tgz",
"integrity": "sha512-QYqzpfwO3/CWf3XP+Z+tkQsfaLL/EnUlXWVkIk5FUPc4sBdTehEqZONuyRt2P67PXAk+NXmTBcc97zw9t1FQrw=="
},
"node_modules/lodash.isplainobject": {
"version": "4.0.6",
"license": "MIT"
"resolved": "https://registry.npmjs.org/lodash.isplainobject/-/lodash.isplainobject-4.0.6.tgz",
"integrity": "sha512-oSXzaWypCMHkPC3NvBEaPHf0KsA5mvPrOPgQWDsbg8n7orZ290M0BmC/jgRZ4vcJ6DTAhjrsSYgdsW/F+MFOBA=="
},
"node_modules/lodash.isstring": {
"version": "4.0.1",
"license": "MIT"
"resolved": "https://registry.npmjs.org/lodash.isstring/-/lodash.isstring-4.0.1.tgz",
"integrity": "sha512-0wJxfxH1wgO3GrbuP+dTTk7op+6L41QCXbGINEmD+ny/G/eCqGzxyCsh7159S+mgDDcoarnBw6PC1PS5+wUGgw=="
},
"node_modules/lodash.memoize": {
"version": "4.1.2",
@@ -28013,7 +27930,8 @@
},
"node_modules/lodash.once": {
"version": "4.1.1",
"license": "MIT"
"resolved": "https://registry.npmjs.org/lodash.once/-/lodash.once-4.1.1.tgz",
"integrity": "sha512-Sb487aTOCr9drQVL8pIxOzVhafOjZN9UU54hiN8PU3uAiSV7lx1yYNpbNmex2PK6dSJoNTSJUUswT651yww3Mg=="
},
"node_modules/lodash.pullall": {
"version": "4.2.0",
@@ -29526,11 +29444,6 @@
"node": ">=10"
}
},
"node_modules/mkdirp-classic": {
"version": "0.5.3",
"license": "MIT",
"optional": true
},
"node_modules/moment": {
"version": "2.29.4",
"license": "MIT",
@@ -29597,16 +29510,6 @@
"version": "2.1.3",
"license": "MIT"
},
"node_modules/msal": {
"version": "1.4.18",
"license": "MIT",
"dependencies": {
"tslib": "^1.9.3"
},
"engines": {
"node": ">=0.8.0"
}
},
"node_modules/multicast-dns": {
"version": "7.2.5",
"dev": true,
@@ -29667,11 +29570,6 @@
"node": ">=0.10.0"
}
},
"node_modules/napi-build-utils": {
"version": "1.0.2",
"license": "MIT",
"optional": true
},
"node_modules/native-promise-only": {
"version": "0.8.1",
"dev": true,
@@ -29756,58 +29654,12 @@
"node": ">=12.0.0"
}
},
"node_modules/node-abi": {
"version": "3.60.0",
"license": "MIT",
"optional": true,
"dependencies": {
"semver": "^7.3.5"
},
"engines": {
"node": ">=10"
}
},
"node_modules/node-abi/node_modules/lru-cache": {
"version": "6.0.0",
"license": "ISC",
"optional": true,
"dependencies": {
"yallist": "^4.0.0"
},
"engines": {
"node": ">=10"
}
},
"node_modules/node-abi/node_modules/semver": {
"version": "7.6.0",
"license": "ISC",
"optional": true,
"dependencies": {
"lru-cache": "^6.0.0"
},
"bin": {
"semver": "bin/semver.js"
},
"engines": {
"node": ">=10"
}
},
"node_modules/node-abi/node_modules/yallist": {
"version": "4.0.0",
"license": "ISC",
"optional": true
},
"node_modules/node-abort-controller": {
"version": "3.1.1",
"dev": true,
"license": "MIT",
"peer": true
},
"node_modules/node-addon-api": {
"version": "4.3.0",
"license": "MIT",
"optional": true
},
"node_modules/node-dir": {
"version": "0.1.17",
"dev": true,
@@ -31006,80 +30858,6 @@
"version": "4.2.0",
"license": "MIT"
},
"node_modules/prebuild-install": {
"version": "7.1.2",
"license": "MIT",
"optional": true,
"dependencies": {
"detect-libc": "^2.0.0",
"expand-template": "^2.0.3",
"github-from-package": "0.0.0",
"minimist": "^1.2.3",
"mkdirp-classic": "^0.5.3",
"napi-build-utils": "^1.0.1",
"node-abi": "^3.3.0",
"pump": "^3.0.0",
"rc": "^1.2.7",
"simple-get": "^4.0.0",
"tar-fs": "^2.0.0",
"tunnel-agent": "^0.6.0"
},
"bin": {
"prebuild-install": "bin.js"
},
"engines": {
"node": ">=10"
}
},
"node_modules/prebuild-install/node_modules/decompress-response": {
"version": "6.0.0",
"license": "MIT",
"optional": true,
"dependencies": {
"mimic-response": "^3.1.0"
},
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/prebuild-install/node_modules/mimic-response": {
"version": "3.1.0",
"license": "MIT",
"optional": true,
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/prebuild-install/node_modules/simple-get": {
"version": "4.0.1",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/feross"
},
{
"type": "patreon",
"url": "https://www.patreon.com/feross"
},
{
"type": "consulting",
"url": "https://feross.org/support"
}
],
"license": "MIT",
"optional": true,
"dependencies": {
"decompress-response": "^6.0.0",
"once": "^1.3.1",
"simple-concat": "^1.0.0"
}
},
"node_modules/prelude-ls": {
"version": "1.2.1",
"license": "MIT",
@@ -31509,28 +31287,6 @@
"version": "0.5.1",
"dev": true
},
"node_modules/rc": {
"version": "1.2.8",
"license": "(BSD-2-Clause OR MIT OR Apache-2.0)",
"optional": true,
"dependencies": {
"deep-extend": "^0.6.0",
"ini": "~1.3.0",
"minimist": "^1.2.0",
"strip-json-comments": "~2.0.1"
},
"bin": {
"rc": "cli.js"
}
},
"node_modules/rc/node_modules/strip-json-comments": {
"version": "2.0.1",
"license": "MIT",
"optional": true,
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/re-resizable": {
"version": "6.9.11",
"license": "MIT",
@@ -34439,45 +34195,6 @@
"node": ">=10"
}
},
"node_modules/tar-fs": {
"version": "2.1.1",
"license": "MIT",
"optional": true,
"dependencies": {
"chownr": "^1.1.1",
"mkdirp-classic": "^0.5.2",
"pump": "^3.0.0",
"tar-stream": "^2.1.4"
}
},
"node_modules/tar-stream": {
"version": "2.2.0",
"license": "MIT",
"optional": true,
"dependencies": {
"bl": "^4.0.3",
"end-of-stream": "^1.4.1",
"fs-constants": "^1.0.0",
"inherits": "^2.0.3",
"readable-stream": "^3.1.1"
},
"engines": {
"node": ">=6"
}
},
"node_modules/tar-stream/node_modules/readable-stream": {
"version": "3.6.2",
"license": "MIT",
"optional": true,
"dependencies": {
"inherits": "^2.0.3",
"string_decoder": "^1.1.1",
"util-deprecate": "^1.0.1"
},
"engines": {
"node": ">= 6"
}
},
"node_modules/tar/node_modules/chownr": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/chownr/-/chownr-2.0.0.tgz",
@@ -36815,13 +36532,6 @@
"integrity": "sha512-JZnDKK8B0RCDw84FNdDAIpZK+JuJw+s7Lz8nksI7SIuU3UXJJslUthsi+uWBUYOwPFwW7W7PRLRfUKpxjtjFCw==",
"dev": true
},
"node_modules/xpath.js": {
"version": "1.1.0",
"license": "MIT",
"engines": {
"node": ">=0.4.0"
}
},
"node_modules/xtend": {
"version": "4.0.2",
"license": "MIT",

View File

@@ -7,8 +7,7 @@
"@azure/arm-cosmosdb": "9.1.0",
"@azure/cosmos": "4.2.0-beta.1",
"@azure/cosmos-language-service": "0.0.5",
"@azure/identity": "1.5.2",
"@azure/ms-rest-nodeauth": "3.1.1",
"@azure/identity": "4.5.0",
"@azure/msal-browser": "2.14.2",
"@babel/plugin-proposal-class-properties": "7.12.1",
"@babel/plugin-proposal-decorators": "7.12.12",

View File

@@ -1,7 +1,7 @@
[defaults]
group = stfaul
sku = P1v2
appserviceplan = stfaul_asp_Linux_centralus_0
location = centralus
web = cosmos-explorer-preview
group = dataexplorer-preview
sku = P1V2
appserviceplan = dataexplorer-preview
location = westus2
web = dataexplorer-preview

View File

@@ -4,8 +4,8 @@ Cosmos Explorer Preview makes it possible to try a working version of any commit
Initial support is for Hosted (Connection string only) or the Azure Portal. Examples:
Connection string URLs: https://cosmos-explorer-preview.azurewebsites.net/commit/COMMIT_SHA/hostedExplorer.html
Portal URLs: https://ms.portal.azure.com/?dataExplorerSource=https://cosmos-explorer-preview.azurewebsites.net/commit/COMMIT_SHA/explorer.html#home
Connection string URLs: https://dataexplorer-preview.azurewebsites.net/commit/COMMIT_SHA/hostedExplorer.html
Portal URLs: https://ms.portal.azure.com/?dataExplorerSource=https://dataexplorer-preview.azurewebsites.net/commit/COMMIT_SHA/explorer.html#home
In both cases replace `COMMIT_SHA` with the commit you want to view. It must have already completed its build on GitHub Actions.
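
For reference, a minimal TypeScript sketch of how these preview URLs are composed from a commit SHA, following the patterns documented above (the SHA argument is a placeholder, not a real commit):

```typescript
// Sketch only: compose the two preview URL forms described above.
const previewSite = "https://dataexplorer-preview.azurewebsites.net";

// Hosted (connection string) experience for a specific commit.
function hostedExplorerUrl(commitSha: string): string {
  return `${previewSite}/commit/${commitSha}/hostedExplorer.html`;
}

// Portal experience: the portal loads Data Explorer from the preview site.
function portalExplorerUrl(commitSha: string): string {
  const explorerSource = `${previewSite}/commit/${commitSha}/explorer.html`;
  return `https://ms.portal.azure.com/?dataExplorerSource=${explorerSource}#home`;
}
```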

View File

@@ -1,4 +1,4 @@
{
"PROXY_PATH": "/proxy",
"msalRedirectURI": "https://cosmos-explorer-preview.azurewebsites.net/"
"msalRedirectURI": "https://dataexplorer-preview.azurewebsites.net/"
}

View File

@@ -3,8 +3,15 @@ const { createProxyMiddleware } = require("http-proxy-middleware");
const port = process.env.PORT || 3000;
const fetch = require("node-fetch");
const api = createProxyMiddleware("/api", {
target: "https://cdb-ms-mpac-pbe.cosmos.azure.com",
const backendEndpoint = "https://cdb-ms-mpac-pbe.cosmos.azure.com";
const previewSiteEndpoint = "https://dataexplorer-preview.azurewebsites.net";
const previewStorageWebsiteEndpoint = "https://dataexplorerpreview.z5.web.core.windows.net/";
const githubApiUrl = "https://api.github.com/repos/Azure/cosmos-explorer";
const githubPullRequestUrl = "https://github.com/Azure/cosmos-explorer/pull";
const azurePortalMpacEndpoint = "https://ms.portal.azure.com/";
const api = createProxyMiddleware({
target: backendEndpoint,
changeOrigin: true,
logLevel: "debug",
bypass: (req, res) => {
@@ -15,8 +22,8 @@ const api = createProxyMiddleware("/api", {
},
});
const proxy = createProxyMiddleware("/proxy", {
target: "https://cdb-ms-mpac-pbe.cosmos.azure.com",
const proxy = createProxyMiddleware({
target: backendEndpoint,
changeOrigin: true,
secure: false,
logLevel: "debug",
@@ -27,35 +34,38 @@ const proxy = createProxyMiddleware("/proxy", {
},
});
const commit = createProxyMiddleware("/commit", {
target: "https://cosmosexplorerpreview.blob.core.windows.net",
const commit = createProxyMiddleware({
target: previewStorageWebsiteEndpoint,
changeOrigin: true,
secure: false,
logLevel: "debug",
pathRewrite: { "^/commit": "$web/" },
pathRewrite: { "^/commit": "/" },
});
const app = express();
app.use(api);
app.use(proxy);
app.use(commit);
app.use("/api", api);
app.use("/proxy", proxy);
app.use("/commit", commit);
app.get("/pull/:pr(\\d+)", (req, res) => {
const pr = req.params.pr;
if (!/^\d+$/.test(pr)) {
return res.status(400).send("Invalid pull request number");
}
const [, query] = req.originalUrl.split("?");
const search = new URLSearchParams(query);
fetch("https://api.github.com/repos/Azure/cosmos-explorer/pulls/" + pr)
fetch(`${githubApiUrl}/pulls/${pr}`)
.then((response) => response.json())
.then(({ head: { ref, sha } }) => {
const prUrl = new URL("https://github.com/Azure/cosmos-explorer/pull/" + pr);
const prUrl = new URL(`${githubPullRequestUrl}/${pr}`);
prUrl.hash = ref;
search.set("feature.pr", prUrl.href);
const explorer = new URL("https://cosmos-explorer-preview.azurewebsites.net/commit/" + sha + "/explorer.html");
const explorer = new URL(`${previewSiteEndpoint}/commit/${sha}/explorer.html`);
explorer.search = search.toString();
const portal = new URL("https://ms.portal.azure.com/");
const portal = new URL(azurePortalMpacEndpoint);
portal.searchParams.set("dataExplorerSource", explorer.href);
return res.redirect(portal.href);
@@ -63,12 +73,10 @@ app.get("/pull/:pr(\\d+)", (req, res) => {
.catch(() => res.sendStatus(500));
});
app.get("/", (req, res) => {
fetch("https://api.github.com/repos/Azure/cosmos-explorer/branches/master")
fetch(`${githubApiUrl}/branches/master`)
.then((response) => response.json())
.then(({ commit: { sha } }) => {
const explorer = new URL(
"https://cosmos-explorer-preview.azurewebsites.net/commit/" + sha + "/hostedExplorer.html"
);
const explorer = new URL(`${previewSiteEndpoint}/commit/${sha}/hostedExplorer.html`);
return res.redirect(explorer.href);
})
.catch(() => res.sendStatus(500));

preview/package-lock.json (generated; 1360 lines changed)

File diff suppressed because it is too large.

View File

@@ -4,7 +4,7 @@
"description": "",
"main": "index.js",
"scripts": {
"deploy": "az webapp up --name \"cosmos-explorer-preview\" --subscription \"cosmosdb-portalteam-generaltest-msft\" --resource-group \"stfaul\"",
"deploy": "az webapp up --name \"dataexplorer-preview\" --subscription \"cosmosdb-portalteam-runners\" --resource-group \"dataexplorer-preview\" --runtime \"NODE:18-lts\" --sku P1V2",
"start": "node index.js",
"test": "echo \"Error: no test specified\" && exit 1"
},
@@ -12,7 +12,8 @@
"author": "Microsoft Corporation",
"dependencies": {
"express": "^4.17.1",
"http-proxy-middleware": "^1.1.0",
"http-proxy-middleware": "^3.0.3",
"node": "^18.20.6",
"node-fetch": "^2.6.1"
}
}

View File

@@ -97,6 +97,12 @@ export enum CapacityMode {
Serverless = "Serverless",
}
export enum WorkloadType {
Learning = "Learning",
DevelopmentTesting = "Development/Testing",
Production = "Production",
None = "None",
}
// flight names returned from the portal are always lowercase
export class Flights {
public static readonly SettingsV2 = "settingsv2";
@@ -119,6 +125,7 @@ export class AfecFeatures {
export class TagNames {
public static defaultExperience: string = "defaultExperience";
public static WorkloadType: string = "hidden-workload-type";
}
export class MongoDBAccounts {
@@ -518,6 +525,11 @@ export class PriorityLevel {
public static readonly Default = "low";
}
export class ariaLabelForLearnMoreLink {
public static readonly AnalyticalStore = "Learn more about analytical store.";
public static readonly AzureSynapseLink = "Learn more about Azure Synapse Link.";
}
export const QueryCopilotSampleDatabaseId = "CopilotSampleDB";
export const QueryCopilotSampleContainerId = "SampleContainer";

View File

@@ -3,12 +3,12 @@ import { getAuthorizationTokenUsingResourceTokens } from "Common/getAuthorizatio
import { AuthorizationToken } from "Contracts/FabricMessageTypes";
import { checkDatabaseResourceTokensValidity } from "Platform/Fabric/FabricUtil";
import { LocalStorageUtility, StorageKey } from "Shared/StorageUtility";
import { useNewPortalBackendEndpoint } from "Utils/EndpointUtils";
import { AuthType } from "../AuthType";
import { BackendApi, PriorityLevel } from "../Common/Constants";
import { PriorityLevel } from "../Common/Constants";
import * as Logger from "../Common/Logger";
import { Platform, configContext } from "../ConfigContext";
import { updateUserContext, userContext } from "../UserContext";
import { isDataplaneRbacSupported } from "../Utils/APITypeUtils";
import { logConsoleError } from "../Utils/NotificationConsoleUtils";
import * as PriorityBasedExecutionUtils from "../Utils/PriorityBasedExecutionUtils";
import { EmulatorMasterKey, HttpHeaders } from "./Constants";
@@ -19,7 +19,7 @@ const _global = typeof self === "undefined" ? window : self;
export const tokenProvider = async (requestInfo: Cosmos.RequestInfo) => {
const { verb, resourceId, resourceType, headers } = requestInfo;
const dataPlaneRBACOptionEnabled = userContext.dataPlaneRbacEnabled && userContext.apiType === "SQL";
const dataPlaneRBACOptionEnabled = userContext.dataPlaneRbacEnabled && isDataplaneRbacSupported(userContext.apiType);
if (userContext.features.enableAadDataPlane || dataPlaneRBACOptionEnabled) {
Logger.logInfo(
`AAD Data Plane Feature flag set to ${userContext.features.enableAadDataPlane} for account with disable local auth ${userContext.databaseAccount.properties.disableLocalAuth} `,
@@ -125,10 +125,6 @@ export async function getTokenFromAuthService(
resourceType: string,
resourceId?: string,
): Promise<AuthorizationToken> {
if (!useNewPortalBackendEndpoint(BackendApi.RuntimeProxy)) {
return getTokenFromAuthService_ToBeDeprecated(verb, resourceType, resourceId);
}
try {
const host: string = configContext.PORTAL_BACKEND_ENDPOINT;
const response: Response = await _global.fetch(host + "/api/connectionstring/runtimeproxy/authorizationtokens", {
@@ -151,34 +147,6 @@ export async function getTokenFromAuthService(
}
}
export async function getTokenFromAuthService_ToBeDeprecated(
verb: string,
resourceType: string,
resourceId?: string,
): Promise<AuthorizationToken> {
try {
const host = configContext.BACKEND_ENDPOINT;
const response = await _global.fetch(host + "/api/guest/runtimeproxy/authorizationTokens", {
method: "POST",
headers: {
"content-type": "application/json",
"x-ms-encrypted-auth-token": userContext.accessToken,
},
body: JSON.stringify({
verb,
resourceType,
resourceId,
}),
});
//TODO I am not sure why we have to parse the JSON again here. fetch should do it for us when we call .json()
const result = JSON.parse(await response.json());
return result;
} catch (error) {
logConsoleError(`Failed to get authorization headers for ${resourceType}: ${getErrorMessage(error)}`);
return Promise.reject(error);
}
}
// The Capability is a bitmap, which cosmosdb backend decodes as per the below enum
enum SDKSupportedCapabilities {
None = 0,
@@ -203,8 +171,10 @@ export function client(): Cosmos.CosmosClient {
}
let _defaultHeaders: Cosmos.CosmosHeaders = {};
_defaultHeaders["x-ms-cosmos-sdk-supportedcapabilities"] =
SDKSupportedCapabilities.None | SDKSupportedCapabilities.PartitionMerge;
_defaultHeaders["x-ms-cosmos-throughput-bucket"] = 1;
if (
userContext.authType === AuthType.ConnectionString ||

View File

@@ -0,0 +1,34 @@
import { WorkloadType } from "Common/Constants";
import { getWorkloadType } from "Common/DatabaseAccountUtility";
import { DatabaseAccount, Tags } from "Contracts/DataModels";
import { updateUserContext } from "UserContext";
describe("Database Account Utility", () => {
describe("Workload Type", () => {
beforeEach(() => {
updateUserContext({
databaseAccount: {
tags: {} as Tags,
} as DatabaseAccount,
});
});
it("Workload Type should return Learning", () => {
updateUserContext({
databaseAccount: {
tags: {
"hidden-workload-type": WorkloadType.Learning,
} as Tags,
} as DatabaseAccount,
});
const workloadType: WorkloadType = getWorkloadType();
expect(workloadType).toBe(WorkloadType.Learning);
});
it("Workload Type should return None", () => {
const workloadType: WorkloadType = getWorkloadType();
expect(workloadType).toBe(WorkloadType.None);
});
});
});

View File

@@ -1,3 +1,5 @@
import { TagNames, WorkloadType } from "Common/Constants";
import { Tags } from "Contracts/DataModels";
import { userContext } from "../UserContext";
function isVirtualNetworkFilterEnabled() {
@@ -15,3 +17,12 @@ function isPrivateEndpointConnectionsEnabled() {
export function isPublicInternetAccessAllowed(): boolean {
return !isVirtualNetworkFilterEnabled() && !isIpRulesEnabled() && !isPrivateEndpointConnectionsEnabled();
}
export function getWorkloadType(): WorkloadType {
const tags: Tags = userContext?.databaseAccount?.tags;
const workloadType: WorkloadType = tags && (tags[TagNames.WorkloadType] as WorkloadType);
if (!workloadType) {
return WorkloadType.None;
}
return workloadType;
}
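
As a usage illustration, a minimal sketch of a hypothetical helper showing how a caller such as the default-throughput logic from #2021 might branch on getWorkloadType(); the helper name and RU/s values below are illustrative assumptions, not the values used in that change:

```typescript
import { WorkloadType } from "Common/Constants";
import { getWorkloadType } from "Common/DatabaseAccountUtility";

// Sketch only: pick a default throughput based on the account's workload-type tag.
export function getDefaultThroughputForWorkloadType(): number {
  switch (getWorkloadType()) {
    case WorkloadType.Learning:
      return 400; // assumed: smallest provisioned throughput for learning accounts
    case WorkloadType.Production:
      return 4000; // assumed: larger default for production-tagged accounts
    case WorkloadType.DevelopmentTesting:
    case WorkloadType.None:
    default:
      return 1000; // assumed: general-purpose fallback
  }
}
```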

View File

@@ -4,16 +4,8 @@ import { configContext, resetConfigContext, updateConfigContext } from "../Confi
import { DatabaseAccount } from "../Contracts/DataModels";
import { Collection } from "../Contracts/ViewModels";
import DocumentId from "../Explorer/Tree/DocumentId";
import { extractFeatures } from "../Platform/Hosted/extractFeatures";
import { updateUserContext } from "../UserContext";
import {
deleteDocument,
getEndpoint,
getFeatureEndpointOrDefault,
queryDocuments,
readDocument,
updateDocument,
} from "./MongoProxyClient";
import { deleteDocuments, getEndpoint, queryDocuments, readDocument, updateDocument } from "./MongoProxyClient";
const databaseId = "testDB";
@@ -196,20 +188,8 @@ describe("MongoProxyClient", () => {
expect.any(Object),
);
});
it("builds the correct proxy URL in development", () => {
updateConfigContext({
MONGO_BACKEND_ENDPOINT: "https://localhost:1234",
globallyEnabledMongoAPIs: [],
});
updateDocument(databaseId, collection, documentId, "{}");
expect(window.fetch).toHaveBeenCalledWith(
`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer`,
expect.any(Object),
);
});
});
describe("deleteDocument", () => {
describe("deleteDocuments", () => {
beforeEach(() => {
resetConfigContext();
updateUserContext({
@@ -226,9 +206,9 @@ describe("MongoProxyClient", () => {
});
it("builds the correct URL", () => {
deleteDocument(databaseId, collection, documentId);
deleteDocuments(databaseId, collection, [documentId]);
expect(window.fetch).toHaveBeenCalledWith(
`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer`,
`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer/bulkdelete`,
expect.any(Object),
);
});
@@ -238,9 +218,9 @@ describe("MongoProxyClient", () => {
MONGO_PROXY_ENDPOINT: "https://localhost:1234",
globallyEnabledMongoAPIs: [],
});
deleteDocument(databaseId, collection, documentId);
deleteDocuments(databaseId, collection, [documentId]);
expect(window.fetch).toHaveBeenCalledWith(
`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer`,
`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer/bulkdelete`,
expect.any(Object),
);
});
@@ -275,33 +255,4 @@ describe("MongoProxyClient", () => {
expect(endpoint).toEqual(`${configContext.MONGO_PROXY_ENDPOINT}/api/connectionstring/mongo/explorer`);
});
});
describe("getFeatureEndpointOrDefault", () => {
beforeEach(() => {
resetConfigContext();
updateConfigContext({
MONGO_PROXY_ENDPOINT: MongoProxyEndpoints.Prod,
globallyEnabledMongoAPIs: [],
});
const params = new URLSearchParams({
"feature.mongoProxyEndpoint": MongoProxyEndpoints.Prod,
"feature.mongoProxyAPIs": "readDocument|createDocument",
});
const features = extractFeatures(params);
updateUserContext({
authType: AuthType.AAD,
features: features,
});
});
it("returns a local endpoint", () => {
const endpoint = getFeatureEndpointOrDefault("readDocument");
expect(endpoint).toEqual(`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer`);
});
it("returns a production endpoint", () => {
const endpoint = getFeatureEndpointOrDefault("DeleteDocument");
expect(endpoint).toEqual(`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer`);
});
});
});

View File

@@ -1,20 +1,13 @@
import { Constants as CosmosSDKConstants } from "@azure/cosmos";
import {
allowedMongoProxyEndpoints_ToBeDeprecated,
defaultAllowedMongoProxyEndpoints,
validateEndpoint,
} from "Utils/EndpointUtils";
import queryString from "querystring";
import { AuthType } from "../AuthType";
import { configContext } from "../ConfigContext";
import * as DataModels from "../Contracts/DataModels";
import { MessageTypes } from "../Contracts/ExplorerContracts";
import { Collection } from "../Contracts/ViewModels";
import DocumentId from "../Explorer/Tree/DocumentId";
import { hasFlag } from "../Platform/Hosted/extractFeatures";
import { userContext } from "../UserContext";
import { logConsoleError } from "../Utils/NotificationConsoleUtils";
import { ApiType, ContentType, HttpHeaders, HttpStatusCodes, MongoProxyApi, MongoProxyEndpoints } from "./Constants";
import { ApiType, ContentType, HttpHeaders, HttpStatusCodes } from "./Constants";
import { MinimalQueryIterator } from "./IteratorUtilities";
import { sendMessage } from "./MessageHandler";
@@ -67,10 +60,6 @@ export function queryDocuments(
query: string,
continuationToken?: string,
): Promise<QueryResponse> {
if (!useMongoProxyEndpoint(MongoProxyApi.ResourceList) || !useMongoProxyEndpoint(MongoProxyApi.QueryDocuments)) {
return queryDocuments_ToBeDeprecated(databaseId, collection, isResourceList, query, continuationToken);
}
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const params = {
@@ -89,7 +78,7 @@ export function queryDocuments(
query,
};
const endpoint = getFeatureEndpointOrDefault(MongoProxyApi.ResourceList) || "";
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT) || "";
const headers = {
...defaultHeaders,
@@ -127,76 +116,11 @@ export function queryDocuments(
});
}
function queryDocuments_ToBeDeprecated(
databaseId: string,
collection: Collection,
isResourceList: boolean,
query: string,
continuationToken?: string,
): Promise<QueryResponse> {
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const params = {
db: databaseId,
coll: collection.id(),
resourceUrl: `${resourceEndpoint}dbs/${databaseId}/colls/${collection.id()}/docs/`,
rid: collection.rid,
rtype: "docs",
sid: userContext.subscriptionId,
rg: userContext.resourceGroup,
dba: databaseAccount.name,
pk:
collection && collection.partitionKey && !collection.partitionKey.systemKey
? collection.partitionKeyProperties?.[0]
: "",
};
const endpoint = getFeatureEndpointOrDefault("resourcelist") || "";
const headers = {
...defaultHeaders,
...authHeaders(),
[CosmosSDKConstants.HttpHeaders.IsQuery]: "true",
[CosmosSDKConstants.HttpHeaders.PopulateQueryMetrics]: "true",
[CosmosSDKConstants.HttpHeaders.EnableScanInQuery]: "true",
[CosmosSDKConstants.HttpHeaders.EnableCrossPartitionQuery]: "true",
[CosmosSDKConstants.HttpHeaders.ParallelizeCrossPartitionQuery]: "true",
[HttpHeaders.contentType]: "application/query+json",
};
if (continuationToken) {
headers[CosmosSDKConstants.HttpHeaders.Continuation] = continuationToken;
}
const path = isResourceList ? "/resourcelist" : "";
return window
.fetch(`${endpoint}${path}?${queryString.stringify(params)}`, {
method: "POST",
body: JSON.stringify({ query }),
headers,
})
.then(async (response) => {
if (response.ok) {
return {
continuationToken: response.headers.get(CosmosSDKConstants.HttpHeaders.Continuation),
documents: (await response.json()).Documents as DataModels.DocumentId[],
headers: response.headers,
};
}
await errorHandling(response, "querying documents", params);
return undefined;
});
}
export function readDocument(
databaseId: string,
collection: Collection,
documentId: DocumentId,
): Promise<DataModels.DocumentId> {
if (!useMongoProxyEndpoint(MongoProxyApi.ReadDocument)) {
return readDocument_ToBeDeprecated(databaseId, collection, documentId);
}
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const idComponents = documentId.self.split("/");
@@ -217,7 +141,7 @@ export function readDocument(
: "",
};
const endpoint = getFeatureEndpointOrDefault(MongoProxyApi.ReadDocument);
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT);
return window
.fetch(endpoint, {
@@ -237,61 +161,12 @@ export function readDocument(
});
}
export function readDocument_ToBeDeprecated(
databaseId: string,
collection: Collection,
documentId: DocumentId,
): Promise<DataModels.DocumentId> {
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const idComponents = documentId.self.split("/");
const path = idComponents.slice(0, 4).join("/");
const rid = encodeURIComponent(idComponents[5]);
const params = {
db: databaseId,
coll: collection.id(),
resourceUrl: `${resourceEndpoint}${path}/${rid}`,
rid,
rtype: "docs",
sid: userContext.subscriptionId,
rg: userContext.resourceGroup,
dba: databaseAccount.name,
pk:
documentId && documentId.partitionKey && !documentId.partitionKey.systemKey
? documentId.partitionKeyProperties?.[0]
: "",
};
const endpoint = getFeatureEndpointOrDefault("readDocument");
return window
.fetch(`${endpoint}?${queryString.stringify(params)}`, {
method: "GET",
headers: {
...defaultHeaders,
...authHeaders(),
[CosmosSDKConstants.HttpHeaders.PartitionKey]: encodeURIComponent(
JSON.stringify(documentId.partitionKeyHeader()),
),
},
})
.then(async (response) => {
if (response.ok) {
return response.json();
}
return await errorHandling(response, "reading document", params);
});
}
export function createDocument(
databaseId: string,
collection: Collection,
partitionKeyProperty: string,
documentContent: unknown,
): Promise<DataModels.DocumentId> {
if (!useMongoProxyEndpoint(MongoProxyApi.CreateDocument)) {
return createDocument_ToBeDeprecated(databaseId, collection, partitionKeyProperty, documentContent);
}
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const params = {
@@ -308,7 +183,7 @@ export function createDocument(
documentContent: JSON.stringify(documentContent),
};
const endpoint = getFeatureEndpointOrDefault(MongoProxyApi.CreateDocument);
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT);
return window
.fetch(`${endpoint}/createDocument`, {
@@ -328,54 +203,12 @@ export function createDocument(
});
}
export function createDocument_ToBeDeprecated(
databaseId: string,
collection: Collection,
partitionKeyProperty: string,
documentContent: unknown,
): Promise<DataModels.DocumentId> {
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const params = {
db: databaseId,
coll: collection.id(),
resourceUrl: `${resourceEndpoint}dbs/${databaseId}/colls/${collection.id()}/docs/`,
rid: collection.rid,
rtype: "docs",
sid: userContext.subscriptionId,
rg: userContext.resourceGroup,
dba: databaseAccount.name,
pk: collection && collection.partitionKey && !collection.partitionKey.systemKey ? partitionKeyProperty : "",
};
const endpoint = getFeatureEndpointOrDefault("createDocument");
return window
.fetch(`${endpoint}/resourcelist?${queryString.stringify(params)}`, {
method: "POST",
body: JSON.stringify(documentContent),
headers: {
...defaultHeaders,
...authHeaders(),
},
})
.then(async (response) => {
if (response.ok) {
return response.json();
}
return await errorHandling(response, "creating document", params);
});
}
export function updateDocument(
databaseId: string,
collection: Collection,
documentId: DocumentId,
documentContent: string,
): Promise<DataModels.DocumentId> {
if (!useMongoProxyEndpoint(MongoProxyApi.UpdateDocument)) {
return updateDocument_ToBeDeprecated(databaseId, collection, documentId, documentContent);
}
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const idComponents = documentId.self.split("/");
@@ -396,7 +229,7 @@ export function updateDocument(
: "",
documentContent,
};
const endpoint = getFeatureEndpointOrDefault(MongoProxyApi.UpdateDocument);
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT);
return window
.fetch(endpoint, {
@@ -417,139 +250,6 @@ export function updateDocument(
});
}
export function updateDocument_ToBeDeprecated(
databaseId: string,
collection: Collection,
documentId: DocumentId,
documentContent: string,
): Promise<DataModels.DocumentId> {
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const idComponents = documentId.self.split("/");
const path = idComponents.slice(0, 5).join("/");
const rid = encodeURIComponent(idComponents[5]);
const params = {
db: databaseId,
coll: collection.id(),
resourceUrl: `${resourceEndpoint}${path}/${rid}`,
rid,
rtype: "docs",
sid: userContext.subscriptionId,
rg: userContext.resourceGroup,
dba: databaseAccount.name,
pk:
documentId && documentId.partitionKey && !documentId.partitionKey.systemKey
? documentId.partitionKeyProperties?.[0]
: "",
};
const endpoint = getFeatureEndpointOrDefault("updateDocument");
return window
.fetch(`${endpoint}?${queryString.stringify(params)}`, {
method: "PUT",
body: documentContent,
headers: {
...defaultHeaders,
...authHeaders(),
[HttpHeaders.contentType]: ContentType.applicationJson,
[CosmosSDKConstants.HttpHeaders.PartitionKey]: JSON.stringify(documentId.partitionKeyHeader()),
},
})
.then(async (response) => {
if (response.ok) {
return response.json();
}
return await errorHandling(response, "updating document", params);
});
}
export function deleteDocument(databaseId: string, collection: Collection, documentId: DocumentId): Promise<void> {
if (!useMongoProxyEndpoint(MongoProxyApi.DeleteDocument)) {
return deleteDocument_ToBeDeprecated(databaseId, collection, documentId);
}
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const idComponents = documentId.self.split("/");
const path = idComponents.slice(0, 5).join("/");
const rid = encodeURIComponent(idComponents[5]);
const params = {
databaseID: databaseId,
collectionID: collection.id(),
resourceUrl: `${resourceEndpoint}${path}/${rid}`,
resourceID: rid,
resourceType: "docs",
subscriptionID: userContext.subscriptionId,
resourceGroup: userContext.resourceGroup,
databaseAccountName: databaseAccount.name,
partitionKey:
documentId && documentId.partitionKey && !documentId.partitionKey.systemKey
? documentId.partitionKeyProperties?.[0]
: "",
};
const endpoint = getFeatureEndpointOrDefault(MongoProxyApi.DeleteDocument);
return window
.fetch(endpoint, {
method: "DELETE",
body: JSON.stringify(params),
headers: {
...defaultHeaders,
...authHeaders(),
[HttpHeaders.contentType]: ContentType.applicationJson,
},
})
.then(async (response) => {
if (response.ok) {
return undefined;
}
return await errorHandling(response, "deleting document", params);
});
}
export function deleteDocument_ToBeDeprecated(
databaseId: string,
collection: Collection,
documentId: DocumentId,
): Promise<void> {
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const idComponents = documentId.self.split("/");
const path = idComponents.slice(0, 5).join("/");
const rid = encodeURIComponent(idComponents[5]);
const params = {
db: databaseId,
coll: collection.id(),
resourceUrl: `${resourceEndpoint}${path}/${rid}`,
rid,
rtype: "docs",
sid: userContext.subscriptionId,
rg: userContext.resourceGroup,
dba: databaseAccount.name,
pk:
documentId && documentId.partitionKey && !documentId.partitionKey.systemKey
? documentId.partitionKeyProperties?.[0]
: "",
};
const endpoint = getFeatureEndpointOrDefault("deleteDocument");
return window
.fetch(`${endpoint}?${queryString.stringify(params)}`, {
method: "DELETE",
headers: {
...defaultHeaders,
...authHeaders(),
[HttpHeaders.contentType]: ContentType.applicationJson,
[CosmosSDKConstants.HttpHeaders.PartitionKey]: JSON.stringify(documentId.partitionKeyHeader()),
},
})
.then(async (response) => {
if (response.ok) {
return undefined;
}
return await errorHandling(response, "deleting document", params);
});
}
export function deleteDocuments(
databaseId: string,
collection: Collection,
@@ -575,7 +275,7 @@ export function deleteDocuments(
resourceGroup: userContext.resourceGroup,
databaseAccountName: databaseAccount.name,
};
const endpoint = getFeatureEndpointOrDefault(MongoProxyApi.BulkDelete);
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT);
return window
.fetch(`${endpoint}/bulkdelete`, {
@@ -599,9 +299,6 @@ export function deleteDocuments(
export function createMongoCollectionWithProxy(
params: DataModels.CreateCollectionParams,
): Promise<DataModels.Collection> {
if (!useMongoProxyEndpoint(MongoProxyApi.CreateCollectionWithProxy)) {
return createMongoCollectionWithProxy_ToBeDeprecated(params);
}
const { databaseAccount } = userContext;
const shardKey: string = params.partitionKey?.paths[0];
@@ -622,7 +319,7 @@ export function createMongoCollectionWithProxy(
isSharded: !!shardKey,
};
const endpoint = getFeatureEndpointOrDefault(MongoProxyApi.CreateCollectionWithProxy);
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT);
return window
.fetch(`${endpoint}/createCollection`, {
@@ -642,70 +339,6 @@ export function createMongoCollectionWithProxy(
});
}
export function createMongoCollectionWithProxy_ToBeDeprecated(
params: DataModels.CreateCollectionParams,
): Promise<DataModels.Collection> {
const { databaseAccount } = userContext;
const shardKey: string = params.partitionKey?.paths[0];
const mongoParams: DataModels.MongoParameters = {
resourceUrl: databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint,
db: params.databaseId,
coll: params.collectionId,
pk: shardKey,
offerThroughput: params.autoPilotMaxThroughput || params.offerThroughput,
cd: params.createNewDatabase,
st: params.databaseLevelThroughput,
is: !!shardKey,
rid: "",
rtype: "colls",
sid: userContext.subscriptionId,
rg: userContext.resourceGroup,
dba: databaseAccount.name,
isAutoPilot: !!params.autoPilotMaxThroughput,
};
const endpoint = getFeatureEndpointOrDefault("createCollectionWithProxy");
return window
.fetch(
`${endpoint}/createCollection?${queryString.stringify(
mongoParams as unknown as queryString.ParsedUrlQueryInput,
)}`,
{
method: "POST",
headers: {
...defaultHeaders,
...authHeaders(),
[HttpHeaders.contentType]: "application/json",
},
},
)
.then(async (response) => {
if (response.ok) {
return response.json();
}
return await errorHandling(response, "creating collection", mongoParams);
});
}
export function getFeatureEndpointOrDefault(feature: string): string {
let endpoint;
if (useMongoProxyEndpoint(feature)) {
endpoint = configContext.MONGO_PROXY_ENDPOINT;
} else {
const allowedMongoProxyEndpoints = configContext.allowedMongoProxyEndpoints || [
...defaultAllowedMongoProxyEndpoints,
...allowedMongoProxyEndpoints_ToBeDeprecated,
];
endpoint =
hasFlag(userContext.features.mongoProxyAPIs, feature) &&
validateEndpoint(userContext.features.mongoProxyEndpoint, allowedMongoProxyEndpoints)
? userContext.features.mongoProxyEndpoint
: configContext.MONGO_BACKEND_ENDPOINT || configContext.BACKEND_ENDPOINT;
}
return getEndpoint(endpoint);
}
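A minimal sketch of the resolution order implemented by getFeatureEndpointOrDefault above; the call and variable name below are illustrative and not taken from this diff:

// Sketch only — shows the precedence, not real configuration values.
// 1. The API is enabled for the configured MONGO_PROXY_ENDPOINT              -> Mongo proxy
// 2. Else, a mongoProxyEndpoint feature override with the matching feature
//    flag that passes validateEndpoint                                       -> the override
// 3. Else                                                                    -> MONGO_BACKEND_ENDPOINT, falling back to BACKEND_ENDPOINT
const readDocumentUrl = getFeatureEndpointOrDefault(MongoProxyApi.ReadDocument);
// getEndpoint() then appends the Mongo Explorer API path ("/api/mongo/explorer") to the chosen base.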
export function getEndpoint(endpoint: string): string {
let url = endpoint + "/api/mongo/explorer";
@@ -719,84 +352,6 @@ export function getEndpoint(endpoint: string): string {
return url;
}
export function useMongoProxyEndpoint(mongoProxyApi: string): boolean {
const mongoProxyEnvironmentMap: { [key: string]: string[] } = {
[MongoProxyApi.ResourceList]: [
MongoProxyEndpoints.Development,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
],
[MongoProxyApi.QueryDocuments]: [
MongoProxyEndpoints.Development,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
],
[MongoProxyApi.CreateDocument]: [
MongoProxyEndpoints.Development,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
],
[MongoProxyApi.ReadDocument]: [
MongoProxyEndpoints.Development,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
],
[MongoProxyApi.UpdateDocument]: [
MongoProxyEndpoints.Development,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
],
[MongoProxyApi.DeleteDocument]: [
MongoProxyEndpoints.Development,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
],
[MongoProxyApi.CreateCollectionWithProxy]: [
MongoProxyEndpoints.Development,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
],
[MongoProxyApi.LegacyMongoShell]: [
MongoProxyEndpoints.Development,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
],
[MongoProxyApi.BulkDelete]: [
MongoProxyEndpoints.Development,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
],
};
if (!mongoProxyEnvironmentMap[mongoProxyApi] || !configContext.MONGO_PROXY_ENDPOINT) {
return false;
}
if (configContext.globallyEnabledMongoAPIs.includes(mongoProxyApi)) {
return true;
}
return mongoProxyEnvironmentMap[mongoProxyApi].includes(configContext.MONGO_PROXY_ENDPOINT);
}
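A small illustrative sketch of how this gate is consumed by the functions earlier in this file, assuming configContext.MONGO_PROXY_ENDPOINT is set to one of the listed MongoProxyEndpoints values:

// Sketch only: with configContext.MONGO_PROXY_ENDPOINT = MongoProxyEndpoints.Prod,
// every API listed in mongoProxyEnvironmentMap (or in globallyEnabledMongoAPIs) resolves to the proxy.
useMongoProxyEndpoint(MongoProxyApi.ReadDocument); // true  -> readDocument() talks to the Mongo proxy
useMongoProxyEndpoint("someUnlistedApi");          // false -> the caller falls back to its *_ToBeDeprecated path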
export class ThrottlingError extends Error {
constructor(message: string) {
super(message);

View File

@@ -105,6 +105,8 @@ const readCollectionOfferWithARM = async (databaseId: string, collectionId: stri
? parseInt(resource.softAllowedMaximumThroughput)
: resource.softAllowedMaximumThroughput;
const throughputBuckets = resource?.throughputBuckets;
if (autoscaleSettings) {
return {
id: offerId,
@@ -114,6 +116,7 @@ const readCollectionOfferWithARM = async (databaseId: string, collectionId: stri
offerReplacePending: resource.offerReplacePending === "true",
instantMaximumThroughput,
softAllowedMaximumThroughput,
throughputBuckets,
};
}
@@ -125,6 +128,7 @@ const readCollectionOfferWithARM = async (databaseId: string, collectionId: stri
offerReplacePending: resource.offerReplacePending === "true",
instantMaximumThroughput,
softAllowedMaximumThroughput,
throughputBuckets,
};
}

View File

@@ -1,6 +1,6 @@
import { OfferDefinition, RequestOptions } from "@azure/cosmos";
import { AuthType } from "../../AuthType";
import { Offer, SDKOfferDefinition, UpdateOfferParams } from "../../Contracts/DataModels";
import { Offer, SDKOfferDefinition, ThroughputBucket, UpdateOfferParams } from "../../Contracts/DataModels";
import { userContext } from "../../UserContext";
import {
migrateCassandraKeyspaceToAutoscale,
@@ -359,6 +359,13 @@ const createUpdateOfferBody = (params: UpdateOfferParams): ThroughputSettingsUpd
body.properties.resource.throughput = params.manualThroughput;
}
if (params.throughputBuckets) {
const throughputBuckets = params.throughputBuckets.filter(
(bucket: ThroughputBucket) => bucket.maxThroughputPercentage !== 100,
);
body.properties.resource.throughputBuckets = throughputBuckets;
}
return body;
};
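A minimal worked example of the bucket filter above, with illustrative values: buckets left at the 100% default are treated as unconfigured and omitted from the ARM request body.

// Sketch only — illustrative values, not repo code.
const requested: ThroughputBucket[] = [
  { id: 1, maxThroughputPercentage: 50 },  // kept: explicitly capped
  { id: 2, maxThroughputPercentage: 100 }, // dropped: 100% means "no cap", the service default
];
const persisted = requested.filter((bucket) => bucket.maxThroughputPercentage !== 100);
// persisted === [{ id: 1, maxThroughputPercentage: 50 }]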

View File

@@ -12,7 +12,6 @@ import {
allowedGraphEndpoints,
allowedHostedExplorerEndpoints,
allowedJunoOrigins,
allowedMongoBackendEndpoints,
allowedMsalRedirectEndpoints,
defaultAllowedArmEndpoints,
defaultAllowedBackendEndpoints,
@@ -50,10 +49,8 @@ export interface ConfigContext {
CATALOG_API_KEY: string;
ARCADIA_ENDPOINT: string;
ARCADIA_LIVY_ENDPOINT_DNS_ZONE: string;
BACKEND_ENDPOINT?: string;
PORTAL_BACKEND_ENDPOINT: string;
NEW_BACKEND_APIS?: BackendApi[];
MONGO_BACKEND_ENDPOINT?: string;
MONGO_PROXY_ENDPOINT: string;
CASSANDRA_PROXY_ENDPOINT: string;
NEW_CASSANDRA_APIS?: string[];
@@ -91,7 +88,7 @@ let configContext: Readonly<ConfigContext> = {
`^https:\\/\\/.*\\.analysis-df\\.net$`,
`^https:\\/\\/.*\\.analysis-df\\.windows\\.net$`,
`^https:\\/\\/.*\\.azure-test\\.net$`,
`^https:\\/\\/cosmos-explorer-preview\\.azurewebsites\\.net$`,
`^https:\\/\\/dataexplorer-preview\\.azurewebsites\\.net$`,
], // Webpack injects this at build time
gitSha: process.env.GIT_SHA,
hostedExplorerURL: "https://cosmos.azure.com/",
@@ -109,7 +106,6 @@ let configContext: Readonly<ConfigContext> = {
GITHUB_CLIENT_ID: "6cb2f63cf6f7b5cbdeca", // Registered OAuth app: https://github.com/organizations/AzureCosmosDBNotebooks/settings/applications/1189306
GITHUB_TEST_ENV_CLIENT_ID: "b63fc8cbf87fd3c6e2eb", // Registered OAuth app: https://github.com/organizations/AzureCosmosDBNotebooks/settings/applications/1777772
JUNO_ENDPOINT: JunoEndpoints.Prod,
BACKEND_ENDPOINT: "https://main.documentdb.ext.azure.com",
PORTAL_BACKEND_ENDPOINT: PortalBackendEndpoints.Prod,
MONGO_PROXY_ENDPOINT: MongoProxyEndpoints.Prod,
CASSANDRA_PROXY_ENDPOINT: CassandraProxyEndpoints.Prod,
@@ -152,15 +148,6 @@ export function updateConfigContext(newContext: Partial<ConfigContext>): void {
delete newContext.ARCADIA_ENDPOINT;
}
if (
!validateEndpoint(
newContext.BACKEND_ENDPOINT,
configContext.allowedBackendEndpoints || defaultAllowedBackendEndpoints,
)
) {
delete newContext.BACKEND_ENDPOINT;
}
if (
!validateEndpoint(
newContext.MONGO_PROXY_ENDPOINT,
@@ -170,10 +157,6 @@ export function updateConfigContext(newContext: Partial<ConfigContext>): void {
delete newContext.MONGO_PROXY_ENDPOINT;
}
if (!validateEndpoint(newContext.MONGO_BACKEND_ENDPOINT, allowedMongoBackendEndpoints)) {
delete newContext.MONGO_BACKEND_ENDPOINT;
}
if (
!validateEndpoint(
newContext.CASSANDRA_PROXY_ENDPOINT,

View File

@@ -6,6 +6,7 @@ export interface ArmEntity {
location: string;
type: string;
kind: string;
tags?: Tags;
}
export interface DatabaseAccount extends ArmEntity {
@@ -274,6 +275,12 @@ export interface Offer {
offerReplacePending: boolean;
instantMaximumThroughput?: number;
softAllowedMaximumThroughput?: number;
throughputBuckets?: ThroughputBucket[];
}
export interface ThroughputBucket {
id: number;
maxThroughputPercentage: number;
}
export interface SDKOfferDefinition extends Resource {
@@ -396,6 +403,7 @@ export interface UpdateOfferParams {
collectionId?: string;
migrateToAutoPilot?: boolean;
migrateToManual?: boolean;
throughputBuckets?: ThroughputBucket[];
}
export interface Notification {
@@ -663,3 +671,5 @@ export interface FeatureRegistration {
state: string;
};
}
export type Tags = { [key: string]: string };

View File

@@ -41,7 +41,7 @@ export enum MessageTypes {
OpenPostgreSQLPasswordReset,
OpenPostgresNetworkingBlade,
OpenCosmosDBNetworkingBlade,
DisplayNPSSurvey,
DisplayNPSSurvey, // unused
OpenVCoreMongoNetworkingBlade,
OpenVCoreMongoConnectionStringsBlade,
GetAuthorizationToken, // unused. Can be removed if the portal uses the same list of enums.

View File

@@ -406,7 +406,6 @@ export interface DataExplorerInputsFrame {
csmEndpoint?: string;
dnsSuffix?: string;
serverId?: string;
extensionEndpoint?: string;
portalBackendEndpoint?: string;
mongoProxyEndpoint?: string;
cassandraProxyEndpoint?: string;

View File

@@ -1,5 +1,7 @@
import { AuthType } from "AuthType";
import { shallow } from "enzyme";
import ko from "knockout";
import { Features } from "Platform/Hosted/extractFeatures";
import React from "react";
import { updateCollection } from "../../../Common/dataAccess/updateCollection";
import { updateOffer } from "../../../Common/dataAccess/updateOffer";
@@ -247,4 +249,42 @@ describe("SettingsComponent", () => {
expect(conflictResolutionPolicy.mode).toEqual(DataModels.ConflictResolutionMode.Custom);
expect(conflictResolutionPolicy.conflictResolutionProcedure).toEqual(expectSprocPath);
});
it("should save throughput bucket changes when Save button is clicked", async () => {
updateUserContext({
apiType: "SQL",
features: { enableThroughputBuckets: true } as Features,
authType: AuthType.AAD,
});
const wrapper = shallow(<SettingsComponent {...baseProps} />);
const settingsComponentInstance = wrapper.instance() as SettingsComponent;
const isEnabled = settingsComponentInstance["throughputBucketsEnabled"];
expect(isEnabled).toBe(true);
wrapper.setState({
isThroughputBucketsSaveable: true,
throughputBuckets: [
{ id: 1, maxThroughputPercentage: 70 },
{ id: 2, maxThroughputPercentage: 60 },
],
});
await settingsComponentInstance.onSaveClick();
expect(updateOffer).toHaveBeenCalledWith({
databaseId: collection.databaseId,
collectionId: collection.id(),
currentOffer: expect.any(Object),
autopilotThroughput: collection.offer().autoscaleMaxThroughput,
manualThroughput: collection.offer().manualThroughput,
throughputBuckets: [
{ id: 1, maxThroughputPercentage: 70 },
{ id: 2, maxThroughputPercentage: 60 },
],
});
expect(wrapper.state("isThroughputBucketsSaveable")).toBe(false);
});
});

View File

@@ -7,6 +7,10 @@ import {
ContainerPolicyComponent,
ContainerPolicyComponentProps,
} from "Explorer/Controls/Settings/SettingsSubComponents/ContainerPolicyComponent";
import {
ThroughputBucketsComponent,
ThroughputBucketsComponentProps,
} from "Explorer/Controls/Settings/SettingsSubComponents/ThroughputInputComponents/ThroughputBucketsComponent";
import { useDatabases } from "Explorer/useDatabases";
import { isFullTextSearchEnabled, isVectorSearchEnabled } from "Utils/CapabilityUtils";
import { isRunningOnPublicCloud } from "Utils/CloudUtils";
@@ -86,6 +90,8 @@ export interface SettingsComponentState {
wasAutopilotOriginallySet: boolean;
isScaleSaveable: boolean;
isScaleDiscardable: boolean;
throughputBuckets: DataModels.ThroughputBucket[];
throughputBucketsBaseline: DataModels.ThroughputBucket[];
throughputError: string;
timeToLive: TtlType;
@@ -104,6 +110,7 @@ export interface SettingsComponentState {
changeFeedPolicyBaseline: ChangeFeedPolicyState;
isSubSettingsSaveable: boolean;
isSubSettingsDiscardable: boolean;
isThroughputBucketsSaveable: boolean;
vectorEmbeddingPolicy: DataModels.VectorEmbeddingPolicy;
vectorEmbeddingPolicyBaseline: DataModels.VectorEmbeddingPolicy;
@@ -158,6 +165,7 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
private isVectorSearchEnabled: boolean;
private isFullTextSearchEnabled: boolean;
private totalThroughputUsed: number;
private throughputBucketsEnabled: boolean;
public mongoDBCollectionResource: MongoDBCollectionResource;
constructor(props: SettingsComponentProps) {
@@ -175,6 +183,10 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
this.isFullTextSearchEnabled = isFullTextSearchEnabled() && !hasDatabaseSharedThroughput(this.collection);
this.changeFeedPolicyVisible = userContext.features.enableChangeFeedPolicy;
this.throughputBucketsEnabled =
userContext.apiType === "SQL" &&
userContext.features.enableThroughputBuckets &&
userContext.authType === AuthType.AAD;
// Mongo container with system partition key still treat as "Fixed"
this.isFixedContainer =
@@ -193,6 +205,8 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
wasAutopilotOriginallySet: false,
isScaleSaveable: false,
isScaleDiscardable: false,
throughputBuckets: undefined,
throughputBucketsBaseline: undefined,
throughputError: undefined,
timeToLive: undefined,
@@ -211,6 +225,7 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
changeFeedPolicyBaseline: undefined,
isSubSettingsSaveable: false,
isSubSettingsDiscardable: false,
isThroughputBucketsSaveable: false,
vectorEmbeddingPolicy: undefined,
vectorEmbeddingPolicyBaseline: undefined,
@@ -327,7 +342,8 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
this.state.isIndexingPolicyDirty ||
this.state.isConflictResolutionDirty ||
this.state.isComputedPropertiesDirty ||
(!!this.state.currentMongoIndexes && this.state.isMongoIndexingPolicySaveable)
(!!this.state.currentMongoIndexes && this.state.isMongoIndexingPolicySaveable) ||
this.state.isThroughputBucketsSaveable
);
};
@@ -339,7 +355,8 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
this.state.isIndexingPolicyDirty ||
this.state.isConflictResolutionDirty ||
this.state.isComputedPropertiesDirty ||
(!!this.state.currentMongoIndexes && this.state.isMongoIndexingPolicyDiscardable)
(!!this.state.currentMongoIndexes && this.state.isMongoIndexingPolicyDiscardable) ||
this.state.isThroughputBucketsSaveable
);
};
@@ -419,6 +436,8 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
this.setState({
throughput: this.state.throughputBaseline,
throughputBuckets: this.state.throughputBucketsBaseline,
throughputBucketsBaseline: this.state.throughputBucketsBaseline,
timeToLive: this.state.timeToLiveBaseline,
timeToLiveSeconds: this.state.timeToLiveSecondsBaseline,
displayedTtlSeconds: this.state.displayedTtlSecondsBaseline,
@@ -441,6 +460,7 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
isScaleSaveable: false,
isScaleDiscardable: false,
isSubSettingsSaveable: false,
isThroughputBucketsSaveable: false,
isSubSettingsDiscardable: false,
isContainerPolicyDirty: false,
isIndexingPolicyDirty: false,
@@ -479,6 +499,10 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
private onIndexingPolicyContentChange = (newIndexingPolicy: DataModels.IndexingPolicy): void =>
this.setState({ indexingPolicyContent: newIndexingPolicy });
private onThroughputBucketsSaveableChange = (isSaveable: boolean): void => {
this.setState({ isThroughputBucketsSaveable: isSaveable });
};
private resetShouldDiscardContainerPolicies = (): void => this.setState({ shouldDiscardContainerPolicies: false });
private resetShouldDiscardIndexingPolicy = (): void => this.setState({ shouldDiscardIndexingPolicy: false });
@@ -749,9 +773,13 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
] as DataModels.ComputedProperties;
}
const throughputBuckets = this.offer?.throughputBuckets;
return {
throughput: offerThroughput,
throughputBaseline: offerThroughput,
throughputBuckets,
throughputBucketsBaseline: throughputBuckets,
changeFeedPolicy: changeFeedPolicy,
changeFeedPolicyBaseline: changeFeedPolicy,
timeToLive: timeToLive,
@@ -839,6 +867,10 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
this.setState({ throughput: newThroughput, throughputError });
};
private onThroughputBucketChange = (throughputBuckets: DataModels.ThroughputBucket[]): void => {
this.setState({ throughputBuckets });
};
private onAutoPilotSelected = (isAutoPilotSelected: boolean): void =>
this.setState({ isAutoPilotSelected: isAutoPilotSelected });
@@ -1029,6 +1061,24 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
}
}
if (this.throughputBucketsEnabled && this.state.isThroughputBucketsSaveable) {
const updatedOffer: DataModels.Offer = await updateOffer({
databaseId: this.collection.databaseId,
collectionId: this.collection.id(),
currentOffer: this.collection.offer(),
autopilotThroughput: this.collection.offer().autoscaleMaxThroughput
? this.collection.offer().autoscaleMaxThroughput
: undefined,
manualThroughput: this.collection.offer().manualThroughput
? this.collection.offer().manualThroughput
: undefined,
throughputBuckets: this.state.throughputBuckets,
});
this.collection.offer(updatedOffer);
this.offer = updatedOffer;
this.setState({ isThroughputBucketsSaveable: false });
}
if (this.state.isScaleSaveable) {
const updateOfferParams: DataModels.UpdateOfferParams = {
databaseId: this.collection.databaseId,
@@ -1209,6 +1259,13 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
onConflictResolutionDirtyChange: this.onConflictResolutionDirtyChange,
};
const throughputBucketsComponentProps: ThroughputBucketsComponentProps = {
currentBuckets: this.state.throughputBuckets,
throughputBucketsBaseline: this.state.throughputBucketsBaseline,
onBucketsChange: this.onThroughputBucketChange,
onSaveableChange: this.onThroughputBucketsSaveableChange,
};
const partitionKeyComponentProps: PartitionKeyComponentProps = {
database: useDatabases.getState().findDatabaseWithId(this.collection.databaseId),
collection: this.collection,
@@ -1271,6 +1328,13 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
});
}
if (this.throughputBucketsEnabled) {
tabs.push({
tab: SettingsV2TabTypes.ThroughputBucketsTab,
content: <ThroughputBucketsComponent {...throughputBucketsComponentProps} />,
});
}
const pivotProps: IPivotProps = {
onLinkClick: this.onPivotChange,
selectedKey: SettingsV2TabTypes[this.state.selectedTab],

View File

@@ -14,6 +14,7 @@ import * as ViewModels from "../../../../Contracts/ViewModels";
import { handleError } from "Common/ErrorHandlingUtils";
import { cancelDataTransferJob, pollDataTransferJob } from "Common/dataAccess/dataTransfers";
import { Platform, configContext } from "ConfigContext";
import Explorer from "Explorer/Explorer";
import { ChangePartitionKeyPane } from "Explorer/Panes/ChangePartitionKeyPane/ChangePartitionKeyPane";
import {
@@ -177,12 +178,14 @@ export const PartitionKeyComponent: React.FC<PartitionKeyComponentProps> = ({ da
To change the partition key, a new destination container must be created or an existing destination container
selected. Data will then be copied to the destination container.
</Text>
<PrimaryButton
styles={{ root: { width: "fit-content" } }}
text="Change"
onClick={startPartitionkeyChangeWorkflow}
disabled={isCurrentJobInProgress(portalDataTransferJob)}
/>
{configContext.platform !== Platform.Emulator && (
<PrimaryButton
styles={{ root: { width: "fit-content" } }}
text="Change"
onClick={startPartitionkeyChangeWorkflow}
disabled={isCurrentJobInProgress(portalDataTransferJob)}
/>
)}
{portalDataTransferJob && (
<Stack>
<Text styles={textHeadingStyle}>{partitionKeyName} change job</Text>

View File

@@ -0,0 +1,177 @@
import "@testing-library/jest-dom";
import { fireEvent, render, screen } from "@testing-library/react";
import React from "react";
import { ThroughputBucketsComponent } from "./ThroughputBucketsComponent";
describe("ThroughputBucketsComponent", () => {
const mockOnBucketsChange = jest.fn();
const mockOnSaveableChange = jest.fn();
const defaultProps = {
currentBuckets: [
{ id: 1, maxThroughputPercentage: 50 },
{ id: 2, maxThroughputPercentage: 60 },
],
throughputBucketsBaseline: [
{ id: 1, maxThroughputPercentage: 40 },
{ id: 2, maxThroughputPercentage: 50 },
],
onBucketsChange: mockOnBucketsChange,
onSaveableChange: mockOnSaveableChange,
};
beforeEach(() => {
jest.clearAllMocks();
});
it("renders the correct number of buckets", () => {
render(<ThroughputBucketsComponent {...defaultProps} />);
expect(screen.getAllByText(/Group \d+/)).toHaveLength(5);
});
it("renders buckets in the correct order even if input is unordered", () => {
const unorderedBuckets = [
{ id: 2, maxThroughputPercentage: 60 },
{ id: 1, maxThroughputPercentage: 50 },
];
render(<ThroughputBucketsComponent {...defaultProps} currentBuckets={unorderedBuckets} />);
const bucketLabels = screen.getAllByText(/Group \d+/).map((el) => el.textContent);
expect(bucketLabels).toEqual(["Group 1 (Data Explorer Query Bucket)", "Group 2", "Group 3", "Group 4", "Group 5"]);
});
it("renders all provided buckets even if they exceed the max default bucket count", () => {
const oversizedBuckets = [
{ id: 1, maxThroughputPercentage: 50 },
{ id: 2, maxThroughputPercentage: 60 },
{ id: 3, maxThroughputPercentage: 70 },
{ id: 4, maxThroughputPercentage: 80 },
{ id: 5, maxThroughputPercentage: 90 },
{ id: 6, maxThroughputPercentage: 100 },
{ id: 7, maxThroughputPercentage: 40 },
];
render(<ThroughputBucketsComponent {...defaultProps} currentBuckets={oversizedBuckets} />);
expect(screen.getAllByText(/Group \d+/)).toHaveLength(7);
expect(screen.getByDisplayValue("50")).toBeInTheDocument();
expect(screen.getByDisplayValue("60")).toBeInTheDocument();
expect(screen.getByDisplayValue("70")).toBeInTheDocument();
expect(screen.getByDisplayValue("80")).toBeInTheDocument();
expect(screen.getByDisplayValue("90")).toBeInTheDocument();
expect(screen.getByDisplayValue("100")).toBeInTheDocument();
expect(screen.getByDisplayValue("40")).toBeInTheDocument();
});
it("calls onBucketsChange when a bucket value changes", () => {
render(<ThroughputBucketsComponent {...defaultProps} />);
const input = screen.getByDisplayValue("50");
fireEvent.change(input, { target: { value: "70" } });
expect(mockOnBucketsChange).toHaveBeenCalledWith([
{ id: 1, maxThroughputPercentage: 70 },
{ id: 2, maxThroughputPercentage: 60 },
{ id: 3, maxThroughputPercentage: 100 },
{ id: 4, maxThroughputPercentage: 100 },
{ id: 5, maxThroughputPercentage: 100 },
]);
});
it("triggers onSaveableChange when values change", () => {
render(<ThroughputBucketsComponent {...defaultProps} />);
const input = screen.getByDisplayValue("50");
fireEvent.change(input, { target: { value: "80" } });
expect(mockOnSaveableChange).toHaveBeenCalledWith(true);
});
it("updates state consistently after multiple changes to different buckets", () => {
render(<ThroughputBucketsComponent {...defaultProps} />);
const input1 = screen.getByDisplayValue("50");
fireEvent.change(input1, { target: { value: "70" } });
const input2 = screen.getByDisplayValue("60");
fireEvent.change(input2, { target: { value: "80" } });
expect(mockOnBucketsChange).toHaveBeenCalledWith([
{ id: 1, maxThroughputPercentage: 70 },
{ id: 2, maxThroughputPercentage: 80 },
{ id: 3, maxThroughputPercentage: 100 },
{ id: 4, maxThroughputPercentage: 100 },
{ id: 5, maxThroughputPercentage: 100 },
]);
});
it("resets to baseline when currentBuckets are reset", () => {
const { rerender } = render(<ThroughputBucketsComponent {...defaultProps} />);
const input1 = screen.getByDisplayValue("50");
fireEvent.change(input1, { target: { value: "70" } });
rerender(<ThroughputBucketsComponent {...defaultProps} currentBuckets={defaultProps.throughputBucketsBaseline} />);
expect(screen.getByDisplayValue("40")).toBeInTheDocument();
expect(screen.getByDisplayValue("50")).toBeInTheDocument();
});
it("does not call onBucketsChange when value remains unchanged", () => {
render(<ThroughputBucketsComponent {...defaultProps} />);
const input = screen.getByDisplayValue("50");
fireEvent.change(input, { target: { value: "50" } });
expect(mockOnBucketsChange).not.toHaveBeenCalled();
});
it("disables input and slider when maxThroughputPercentage is 100", () => {
render(
<ThroughputBucketsComponent
{...defaultProps}
currentBuckets={[
{ id: 1, maxThroughputPercentage: 100 },
{ id: 2, maxThroughputPercentage: 50 },
]}
/>,
);
const disabledInputs = screen.getAllByDisplayValue("100");
expect(disabledInputs.length).toBeGreaterThan(0);
expect(disabledInputs[0]).toBeDisabled();
const sliders = screen.getAllByRole("slider");
expect(sliders.length).toBeGreaterThan(0);
expect(sliders[0]).toHaveAttribute("aria-disabled", "true");
expect(sliders[1]).toHaveAttribute("aria-disabled", "false");
});
it("toggles bucket value between 50 and 100 with switch", () => {
render(<ThroughputBucketsComponent {...defaultProps} />);
const toggles = screen.getAllByRole("switch");
fireEvent.click(toggles[0]);
expect(mockOnBucketsChange).toHaveBeenCalledWith([
{ id: 1, maxThroughputPercentage: 100 },
{ id: 2, maxThroughputPercentage: 60 },
{ id: 3, maxThroughputPercentage: 100 },
{ id: 4, maxThroughputPercentage: 100 },
{ id: 5, maxThroughputPercentage: 100 },
]);
fireEvent.click(toggles[0]);
expect(mockOnBucketsChange).toHaveBeenCalledWith([
{ id: 1, maxThroughputPercentage: 50 },
{ id: 2, maxThroughputPercentage: 60 },
{ id: 3, maxThroughputPercentage: 100 },
{ id: 4, maxThroughputPercentage: 100 },
{ id: 5, maxThroughputPercentage: 100 },
]);
});
it("ensures default buckets are used when no buckets are provided", () => {
render(<ThroughputBucketsComponent {...defaultProps} currentBuckets={[]} />);
expect(screen.getAllByText(/Group \d+/)).toHaveLength(5);
expect(screen.getAllByDisplayValue("100")).toHaveLength(5);
});
});

View File

@@ -0,0 +1,105 @@
import { Label, Slider, Stack, TextField, Toggle } from "@fluentui/react";
import { ThroughputBucket } from "Contracts/DataModels";
import React, { FC, useEffect, useState } from "react";
import { isDirty } from "../../SettingsUtils";
const MAX_BUCKET_SIZES = 5;
const DEFAULT_BUCKETS = Array.from({ length: MAX_BUCKET_SIZES }, (_, i) => ({
id: i + 1,
maxThroughputPercentage: 100,
}));
export interface ThroughputBucketsComponentProps {
currentBuckets: ThroughputBucket[];
throughputBucketsBaseline: ThroughputBucket[];
onBucketsChange: (updatedBuckets: ThroughputBucket[]) => void;
onSaveableChange: (isSaveable: boolean) => void;
}
export const ThroughputBucketsComponent: FC<ThroughputBucketsComponentProps> = ({
currentBuckets,
throughputBucketsBaseline,
onBucketsChange,
onSaveableChange,
}) => {
const getThroughputBuckets = (buckets: ThroughputBucket[]): ThroughputBucket[] => {
if (!buckets || buckets.length === 0) {
return DEFAULT_BUCKETS;
}
const maxBuckets = Math.max(DEFAULT_BUCKETS.length, buckets.length);
const adjustedDefaultBuckets = Array.from({ length: maxBuckets }, (_, i) => ({
id: i + 1,
maxThroughputPercentage: 100,
}));
return adjustedDefaultBuckets.map(
(defaultBucket) => buckets?.find((bucket) => bucket.id === defaultBucket.id) || defaultBucket,
);
};
const [throughputBuckets, setThroughputBuckets] = useState<ThroughputBucket[]>(getThroughputBuckets(currentBuckets));
useEffect(() => {
setThroughputBuckets(getThroughputBuckets(currentBuckets));
onSaveableChange(false);
}, [currentBuckets]);
useEffect(() => {
const isChanged = isDirty(throughputBuckets, getThroughputBuckets(throughputBucketsBaseline));
onSaveableChange(isChanged);
}, [throughputBuckets]);
const handleBucketChange = (id: number, newValue: number) => {
const updatedBuckets = throughputBuckets.map((bucket) =>
bucket.id === id ? { ...bucket, maxThroughputPercentage: newValue } : bucket,
);
setThroughputBuckets(updatedBuckets);
const settingsChanged = isDirty(updatedBuckets, throughputBuckets);
settingsChanged && onBucketsChange(updatedBuckets);
};
const onToggle = (id: number, checked: boolean) => {
handleBucketChange(id, checked ? 50 : 100);
};
return (
<Stack tokens={{ childrenGap: "m" }} styles={{ root: { width: "70%", maxWidth: 700 } }}>
<Label>Throughput Buckets</Label>
<Stack>
{throughputBuckets?.map((bucket) => (
<Stack key={bucket.id} horizontal tokens={{ childrenGap: 8 }} verticalAlign="center">
<Slider
min={1}
max={100}
step={1}
value={bucket.maxThroughputPercentage}
onChange={(newValue) => handleBucketChange(bucket.id, newValue)}
showValue={false}
label={`Group ${bucket.id}${bucket.id === 1 ? " (Data Explorer Query Bucket)" : ""}`}
styles={{ root: { flex: 2, maxWidth: 400 } }}
disabled={bucket.maxThroughputPercentage === 100}
/>
<TextField
value={bucket.maxThroughputPercentage.toString()}
onChange={(event, newValue) => handleBucketChange(bucket.id, parseInt(newValue || "0", 10))}
type="number"
suffix="%"
styles={{
fieldGroup: { width: 80 },
}}
disabled={bucket.maxThroughputPercentage === 100}
/>
<Toggle
onText="Active"
offText="Inactive"
checked={bucket.maxThroughputPercentage !== 100}
onChange={(event, checked) => onToggle(bucket.id, checked)}
styles={{ root: { marginBottom: 0 }, text: { fontSize: 12 } }}
></Toggle>
</Stack>
))}
</Stack>
</Stack>
);
};
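For reference, a sketch of what the normalization inside the component produces for a sparse input; getThroughputBuckets is local to the component, so this is written as expected behavior with illustrative values rather than a callable example.

// Sketch only: a sparse input is padded with inactive (100%) defaults up to the 5 default buckets.
//   input:  [{ id: 2, maxThroughputPercentage: 60 }]
//   output: [
//     { id: 1, maxThroughputPercentage: 100 },
//     { id: 2, maxThroughputPercentage: 60 },
//     { id: 3, maxThroughputPercentage: 100 },
//     { id: 4, maxThroughputPercentage: 100 },
//     { id: 5, maxThroughputPercentage: 100 },
//   ]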

View File

@@ -17,14 +17,13 @@ import {
} from "@fluentui/react";
import React from "react";
import * as DataModels from "../../../../../Contracts/DataModels";
import { SubscriptionType } from "../../../../../Contracts/SubscriptionType";
import * as SharedConstants from "../../../../../Shared/Constants";
import { Action, ActionModifiers } from "../../../../../Shared/Telemetry/TelemetryConstants";
import * as TelemetryProcessor from "../../../../../Shared/Telemetry/TelemetryProcessor";
import { userContext } from "../../../../../UserContext";
import * as AutoPilotUtils from "../../../../../Utils/AutoPilotUtils";
import { autoPilotThroughput1K } from "../../../../../Utils/AutoPilotUtils";
import { calculateEstimateNumber, usageInGB } from "../../../../../Utils/PricingUtils";
import { calculateEstimateNumber } from "../../../../../Utils/PricingUtils";
import { Int32 } from "../../../../Panes/Tables/Validators/EntityPropertyValidationCommon";
import {
PriceBreakdown,
@@ -366,29 +365,6 @@ export class ThroughputInputAutoPilotV3Component extends React.Component<
});
};
private minRUperGBSurvey = (): JSX.Element => {
const href = `https://ncv.microsoft.com/vRBTO37jmO?ctx={"AzureSubscriptionId":"${userContext.subscriptionId}","CosmosDBAccountName":"${userContext.databaseAccount?.name}"}`;
const oneTBinKB = 1000000000;
const minRUperGB = 10;
const featureFlagEnabled = userContext.features.showMinRUSurvey;
const collectionIsEligible =
userContext.subscriptionType !== SubscriptionType.Internal &&
this.props.usageSizeInKB > oneTBinKB &&
this.props.minimum >= usageInGB(this.props.usageSizeInKB) * minRUperGB;
if (featureFlagEnabled || collectionIsEligible) {
return (
<Text>
Need to scale below {this.props.minimum} RU/s? Reach out by filling{" "}
<a target="_blank" rel="noreferrer" href={href}>
this questionnaire
</a>
.
</Text>
);
}
return undefined;
};
private renderThroughputModeChoices = (): JSX.Element => {
const labelId = "settingsV2RadioButtonLabelId";
return (
@@ -661,7 +637,6 @@ export class ThroughputInputAutoPilotV3Component extends React.Component<
</Link>
</Text>
)}
{this.minRUperGBSurvey()}
{this.props.spendAckVisible && (
<Checkbox
id="spendAckCheckBox"

View File

@@ -11,7 +11,8 @@ export type isDirtyTypes =
| DataModels.IndexingPolicy
| DataModels.ComputedProperties
| DataModels.VectorEmbedding[]
| DataModels.FullTextPolicy;
| DataModels.FullTextPolicy
| DataModels.ThroughputBucket[];
export const TtlOff = "off";
export const TtlOn = "on";
export const TtlOnNoDefault = "on-nodefault";
@@ -55,6 +56,7 @@ export enum SettingsV2TabTypes {
PartitionKeyTab,
ComputedPropertiesTab,
ContainerVectorPolicyTab,
ThroughputBucketsTab,
}
export enum ContainerPolicyTabTypes {
@@ -167,6 +169,8 @@ export const getTabTitle = (tab: SettingsV2TabTypes): string => {
return "Computed Properties";
case SettingsV2TabTypes.ContainerVectorPolicyTab:
return "Container Policies";
case SettingsV2TabTypes.ThroughputBucketsTab:
return "Throughput Buckets";
default:
throw new Error(`Unknown tab ${tab}`);
}

View File

@@ -1,4 +1,5 @@
import { Checkbox, DirectionalHint, Link, Stack, Text, TextField, TooltipHost } from "@fluentui/react";
import { getWorkloadType } from "Common/DatabaseAccountUtility";
import { useDatabases } from "Explorer/useDatabases";
import React, { FunctionComponent, useEffect, useState } from "react";
import * as Constants from "../../../Common/Constants";
@@ -34,10 +35,15 @@ export const ThroughputInput: FunctionComponent<ThroughputInputProps> = ({
setIsThroughputCapExceeded,
onCostAcknowledgeChange,
}: ThroughputInputProps) => {
const defaultThroughput: number =
isFreeTier ||
isQuickstart ||
[Constants.WorkloadType.Learning, Constants.WorkloadType.DevelopmentTesting].includes(getWorkloadType())
? AutoPilotUtils.autoPilotThroughput1K
: AutoPilotUtils.autoPilotThroughput4K;
const [isAutoscaleSelected, setIsAutoScaleSelected] = useState<boolean>(true);
const [throughput, setThroughput] = useState<number>(
isFreeTier || isQuickstart ? AutoPilotUtils.autoPilotThroughput1K : AutoPilotUtils.autoPilotThroughput4K,
);
const [throughput, setThroughput] = useState<number>(defaultThroughput);
const [isCostAcknowledged, setIsCostAcknowledged] = useState<boolean>(false);
const [throughputError, setThroughputError] = useState<string>("");
const [totalThroughputUsed, setTotalThroughputUsed] = useState<number>(0);
@@ -47,7 +53,6 @@ export const ThroughputInput: FunctionComponent<ThroughputInputProps> = ({
const throughputCap = userContext.databaseAccount?.properties.capacity?.totalThroughputLimit;
const numberOfRegions = userContext.databaseAccount?.properties.locations?.length || 1;
useEffect(() => {
// throughput cap check for the initial state
let totalThroughput = 0;
@@ -157,9 +162,6 @@ export const ThroughputInput: FunctionComponent<ThroughputInputProps> = ({
const handleOnChangeMode = (event: React.ChangeEvent<HTMLInputElement>, mode: string): void => {
if (mode === "Autoscale") {
const defaultThroughput = isFreeTier
? AutoPilotUtils.autoPilotThroughput1K
: AutoPilotUtils.autoPilotThroughput4K;
setThroughput(defaultThroughput);
setIsAutoScaleSelected(true);
setThroughputValue(defaultThroughput);

View File

@@ -35,7 +35,7 @@ import { PhoenixClient } from "../Phoenix/PhoenixClient";
import * as ExplorerSettings from "../Shared/ExplorerSettings";
import { Action, ActionModifiers } from "../Shared/Telemetry/TelemetryConstants";
import * as TelemetryProcessor from "../Shared/Telemetry/TelemetryProcessor";
import { isAccountNewerThanThresholdInMs, updateUserContext, userContext } from "../UserContext";
import { updateUserContext, userContext } from "../UserContext";
import { getCollectionName, getUploadName } from "../Utils/APITypeUtils";
import { stringToBlob } from "../Utils/BlobUtils";
import { isCapabilityEnabled } from "../Utils/CapabilityUtils";
@@ -278,37 +278,6 @@ export default class Explorer {
}
}
public openNPSSurveyDialog(): void {
if (!Platform.Portal || !["Postgres", "SQL", "Mongo"].includes(userContext.apiType)) {
return;
}
const ONE_DAY_IN_MS = 86400000;
const SEVEN_DAYS_IN_MS = 604800000;
// Try Cosmos DB subscription - survey shown to 100% of users at day 1 in Data Explorer.
if (userContext.isTryCosmosDBSubscription) {
if (isAccountNewerThanThresholdInMs(userContext.databaseAccount?.systemData?.createdAt || "", ONE_DAY_IN_MS)) {
Logger.logInfo(
`Sending message to Portal to check if NPS Survey can be displayed in Try Cosmos DB ${userContext.apiType}`,
"Explorer/openNPSSurveyDialog",
);
sendMessage({ type: MessageTypes.DisplayNPSSurvey });
}
} else {
// Show survey when an existing account is older than 7 days
if (
!isAccountNewerThanThresholdInMs(userContext.databaseAccount?.systemData?.createdAt || "", SEVEN_DAYS_IN_MS)
) {
Logger.logInfo(
`Sending message to Portal to check if NPS Survey can be displayed for existing ${userContext.apiType} account older than 7 days`,
"Explorer/openNPSSurveyDialog",
);
sendMessage({ type: MessageTypes.DisplayNPSSurvey });
}
}
}
public async openCESCVAFeedbackBlade(): Promise<void> {
sendMessage({ type: MessageTypes.OpenCESCVAFeedbackBlade });
Logger.logInfo(
@@ -1158,7 +1127,7 @@ export default class Explorer {
await this.initNotebooks(userContext.databaseAccount);
}
await this.refreshSampleData();
this.refreshSampleData();
}
public async configureCopilot(): Promise<void> {
@@ -1183,26 +1152,27 @@ export default class Explorer {
.setCopilotSampleDBEnabled(copilotEnabled && copilotUserDBEnabled && copilotSampleDBEnabled);
}
public async refreshSampleData(): Promise<void> {
try {
if (!userContext.sampleDataConnectionInfo) {
return;
}
const collection: DataModels.Collection = await readSampleCollection();
if (!collection) {
return;
}
const databaseId = userContext.sampleDataConnectionInfo?.databaseId;
if (!databaseId) {
return;
}
const sampleDataResourceTokenCollection = new ResourceTokenCollection(this, databaseId, collection, true);
useDatabases.setState({ sampleDataResourceTokenCollection });
} catch (error) {
Logger.logError(getErrorMessage(error), "Explorer");
public refreshSampleData(): void {
if (!userContext.sampleDataConnectionInfo) {
return;
}
const databaseId = userContext.sampleDataConnectionInfo?.databaseId;
if (!databaseId) {
return;
}
readSampleCollection()
.then((collection: DataModels.Collection) => {
if (!collection) {
return;
}
const sampleDataResourceTokenCollection = new ResourceTokenCollection(this, databaseId, collection, true);
useDatabases.setState({ sampleDataResourceTokenCollection });
})
.catch((error) => {
Logger.logError(getErrorMessage(error), "Explorer/refreshSampleData");
});
}
}

View File

@@ -14,10 +14,6 @@
.flex-direction(@direction: row);
padding: 4px 5px;
label {
padding: 0px;
}
.valueCol {
flex-grow: 1;
padding-right: 5px;
@@ -63,6 +59,10 @@
height: 100%;
}
.customTrashIcon {
padding-top: 33px;
}
.rightPaneTrashIconImg {
vertical-align: top;
}

View File

@@ -142,10 +142,11 @@ export const NewVertexComponent: FunctionComponent<INewVertexComponentProps> = (
<div className="labelCol">
<TextField
className="edgeInput"
label={index === 0 && "Key"}
type="text"
id="propertyKeyNewVertexPane"
componentRef={input}
aria-required="true"
required
placeholder="Key"
autoComplete="off"
aria-label={`Enter value for property ${index + 1}`}
@@ -153,11 +154,11 @@ export const NewVertexComponent: FunctionComponent<INewVertexComponentProps> = (
onChange={(event: React.ChangeEvent<HTMLInputElement>) => onKeyChange(event, index)}
/>
</div>
<span className="mandatoryStar">*&nbsp;</span>
<div className="valueCol">
<TextField
className="edgeInput"
label={index === 0 && "Value"}
type="text"
placeholder="Value"
autoComplete="off"
@@ -169,6 +170,8 @@ export const NewVertexComponent: FunctionComponent<INewVertexComponentProps> = (
<div>
<Dropdown
role="combobox"
label={index === 0 && "Type"}
ariaLabel="Type"
placeholder="Select an option"
defaultSelectedKey={data.values[0].type}
style={{ width: 100 }}
@@ -181,7 +184,7 @@ export const NewVertexComponent: FunctionComponent<INewVertexComponentProps> = (
</div>
<div className="actionCol">
<div
className="rightPaneTrashIcon rightPaneBtns"
className={`rightPaneTrashIcon rightPaneBtns ${index === 0 && "customTrashIcon"}`}
tabIndex={0}
role="button"
aria-label={`Delete ${data.key}`}

View File

@@ -37,21 +37,25 @@ describe("CommandBarComponentButtonFactory tests", () => {
expect(enableAzureSynapseLinkBtn).toBeDefined();
});
it("Button should not be visible for Tables API", () => {
updateUserContext({
databaseAccount: {
properties: {
capabilities: [{ name: "EnableTable" }],
},
} as DatabaseAccount,
});
const buttons = CommandBarComponentButtonFactory.createStaticCommandBarButtons(mockExplorer, selectedNodeState);
const enableAzureSynapseLinkBtn = buttons.find(
(button) => button.commandButtonLabel === enableAzureSynapseLinkBtnLabel,
);
expect(enableAzureSynapseLinkBtn).toBeUndefined();
});
// TODO: Now that Tables API supports dataplane RBAC, calling createStaticCommandBarButtons enables the
// Entra ID Login button, which causes this test to fail with an "Invalid hook call" error. Hooks cannot
// run outside a component render in jest, so this needs to be tested with react-hooks-testing-library.
//
// it("Button should not be visible for Tables API", () => {
// updateUserContext({
// databaseAccount: {
// properties: {
// capabilities: [{ name: "EnableTable" }],
// },
// } as DatabaseAccount,
// });
//
// const buttons = CommandBarComponentButtonFactory.createStaticCommandBarButtons(mockExplorer, selectedNodeState);
// const enableAzureSynapseLinkBtn = buttons.find(
// (button) => button.commandButtonLabel === enableAzureSynapseLinkBtnLabel,
// );
// expect(enableAzureSynapseLinkBtn).toBeUndefined();
//});
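One possible way to revive the commented-out test, sketched under assumptions: render from @testing-library/react is imported at the top of this file (it is not part of this diff), and the hooks inside createStaticCommandBarButtons are allowed to run by invoking the factory from a throwaway component during a real render.

// Sketch only — not part of this change. Hooks must run during a component render,
// so the factory call is wrapped in a probe component before asserting on its output.
it("Button should not be visible for Tables API", () => {
  updateUserContext({
    databaseAccount: {
      properties: {
        capabilities: [{ name: "EnableTable" }],
      },
    } as DatabaseAccount,
  });

  let buttons: ReturnType<typeof CommandBarComponentButtonFactory.createStaticCommandBarButtons> = [];
  const Probe = (): JSX.Element => {
    buttons = CommandBarComponentButtonFactory.createStaticCommandBarButtons(mockExplorer, selectedNodeState);
    return <></>;
  };
  render(<Probe />);

  const enableAzureSynapseLinkBtn = buttons.find(
    (button) => button.commandButtonLabel === enableAzureSynapseLinkBtnLabel,
  );
  expect(enableAzureSynapseLinkBtn).toBeUndefined();
});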
it("Button should not be visible for Cassandra API", () => {
updateUserContext({

View File

@@ -1,4 +1,5 @@
import { KeyboardAction } from "KeyboardShortcuts";
import { isDataplaneRbacSupported } from "Utils/APITypeUtils";
import * as React from "react";
import { useEffect, useState } from "react";
import AddSqlQueryIcon from "../../../../images/AddSqlQuery_16x16.svg";
@@ -61,7 +62,7 @@ export function createStaticCommandBarButtons(
}
}
if (userContext.apiType === "SQL") {
if (isDataplaneRbacSupported(userContext.apiType)) {
const [loginButtonProps, setLoginButtonProps] = useState<CommandButtonComponentProps | undefined>(undefined);
const dataPlaneRbacEnabled = useDataPlaneRbac((state) => state.dataPlaneRbacEnabled);
const aadTokenUpdated = useDataPlaneRbac((state) => state.aadTokenUpdated);

View File

@@ -109,15 +109,15 @@ export class NotificationConsoleComponent extends React.Component<
<div className="statusBar">
<span className="dataTypeIcons">
<span className="notificationConsoleHeaderIconWithData">
<img src={LoadingIcon} alt="in progress items" />
<img src={LoadingIcon} alt="In progress items" />
<span className="numInProgress">{numInProgress}</span>
</span>
<span className="notificationConsoleHeaderIconWithData">
<img src={ErrorBlackIcon} alt="error items" />
<img src={ErrorBlackIcon} alt="Error items" />
<span className="numErroredItems">{numErroredItems}</span>
</span>
<span className="notificationConsoleHeaderIconWithData">
<img src={infoBubbleIcon} alt="info items" />
<img src={infoBubbleIcon} alt="Info items" />
<span className="numInfoItems">{numInfoItems}</span>
</span>
</span>
@@ -134,12 +134,12 @@ export class NotificationConsoleComponent extends React.Component<
data-test="NotificationConsole/ExpandCollapseButton"
role="button"
tabIndex={0}
aria-label={"console button" + (this.props.isConsoleExpanded ? " expanded" : " collapsed")}
aria-expanded={!this.props.isConsoleExpanded}
aria-label="Console"
aria-expanded={this.props.isConsoleExpanded}
>
<img
src={this.props.isConsoleExpanded ? ChevronDownIcon : ChevronUpIcon}
alt={this.props.isConsoleExpanded ? "ChevronDownIcon" : "ChevronUpIcon"}
alt={this.props.isConsoleExpanded ? "Collapse icon" : "Expand icon"}
/>
</div>
</div>

View File

@@ -21,7 +21,7 @@ exports[`NotificationConsoleComponent renders the console 1`] = `
className="notificationConsoleHeaderIconWithData"
>
<img
alt="in progress items"
alt="In progress items"
src={{}}
/>
<span
@@ -34,7 +34,7 @@ exports[`NotificationConsoleComponent renders the console 1`] = `
className="notificationConsoleHeaderIconWithData"
>
<img
alt="error items"
alt="Error items"
src={{}}
/>
<span
@@ -47,7 +47,7 @@ exports[`NotificationConsoleComponent renders the console 1`] = `
className="notificationConsoleHeaderIconWithData"
>
<img
alt="info items"
alt="Info items"
src={{}}
/>
<span
@@ -71,15 +71,15 @@ exports[`NotificationConsoleComponent renders the console 1`] = `
</span>
</div>
<div
aria-expanded={true}
aria-label="console button collapsed"
aria-expanded={false}
aria-label="Console"
className="expandCollapseButton"
data-test="NotificationConsole/ExpandCollapseButton"
role="button"
tabIndex={0}
>
<img
alt="ChevronUpIcon"
alt="Expand icon"
src=""
/>
</div>
@@ -192,7 +192,7 @@ exports[`NotificationConsoleComponent renders the console 2`] = `
className="notificationConsoleHeaderIconWithData"
>
<img
alt="in progress items"
alt="In progress items"
src={{}}
/>
<span
@@ -205,7 +205,7 @@ exports[`NotificationConsoleComponent renders the console 2`] = `
className="notificationConsoleHeaderIconWithData"
>
<img
alt="error items"
alt="Error items"
src={{}}
/>
<span
@@ -218,7 +218,7 @@ exports[`NotificationConsoleComponent renders the console 2`] = `
className="notificationConsoleHeaderIconWithData"
>
<img
alt="info items"
alt="Info items"
src={{}}
/>
<span
@@ -244,15 +244,15 @@ exports[`NotificationConsoleComponent renders the console 2`] = `
</span>
</div>
<div
aria-expanded={true}
aria-label="console button collapsed"
aria-expanded={false}
aria-label="Console"
className="expandCollapseButton"
data-test="NotificationConsole/ExpandCollapseButton"
role="button"
tabIndex={0}
>
<img
alt="ChevronUpIcon"
alt="Expand icon"
src=""
/>
</div>

View File

@@ -1,6 +1,5 @@
import { isPublicInternetAccessAllowed } from "Common/DatabaseAccountUtility";
import { PhoenixClient } from "Phoenix/PhoenixClient";
import { useNewPortalBackendEndpoint } from "Utils/EndpointUtils";
import { cloneDeep } from "lodash";
import create, { UseStore } from "zustand";
import { AuthType } from "../../AuthType";
@@ -128,9 +127,7 @@ export const useNotebook: UseStore<NotebookState> = create((set, get) => ({
userContext.apiType === "Postgres" || userContext.apiType === "VCoreMongo"
? databaseAccount?.location
: databaseAccount?.properties?.writeLocations?.[0]?.locationName.toLowerCase();
const disallowedLocationsUri: string = useNewPortalBackendEndpoint(Constants.BackendApi.DisallowedLocations)
? `${configContext.PORTAL_BACKEND_ENDPOINT}/api/disallowedlocations`
: `${configContext.BACKEND_ENDPOINT}/api/disallowedLocations`;
const disallowedLocationsUri: string = `${configContext.PORTAL_BACKEND_ENDPOINT}/api/disallowedlocations`;
const authorizationHeader = getAuthorizationHeader();
try {
const response = await fetch(disallowedLocationsUri, {

View File

@@ -865,6 +865,7 @@ export class AddCollectionPanel extends React.Component<AddCollectionPanelProps,
<Link
href="https://aka.ms/cosmosdb-synapselink"
target="_blank"
aria-label={Constants.ariaLabelForLearnMoreLink.AzureSynapseLink}
className="capacitycalculator-link"
>
Learn more
@@ -1222,7 +1223,11 @@ export class AddCollectionPanel extends React.Component<AddCollectionPanelProps,
<Text variant="small">
Enable analytical store capability to perform near real-time analytics on your operational data, without
impacting the performance of transactional workloads.{" "}
<Link target="_blank" href="https://aka.ms/analytical-store-overview">
<Link
aria-label={Constants.ariaLabelForLearnMoreLink.AnalyticalStore}
target="_blank"
href="https://aka.ms/analytical-store-overview"
>
Learn more
</Link>
</Text>

View File

@@ -94,6 +94,7 @@
padding-left: @MediumSpace;
.paneErrorLink {
color: @LinkColor;
cursor: pointer;
font-size: @mediumFontSize;
}

View File

@@ -32,6 +32,7 @@ import {
} from "Shared/StorageUtility";
import * as StringUtility from "Shared/StringUtility";
import { updateUserContext, userContext } from "UserContext";
import { isDataplaneRbacSupported } from "Utils/APITypeUtils";
import { acquireMsalTokenForAccount } from "Utils/AuthorizationUtils";
import { logConsoleError, logConsoleInfo } from "Utils/NotificationConsoleUtils";
import * as PriorityBasedExecutionUtils from "Utils/PriorityBasedExecutionUtils";
@@ -174,15 +175,26 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
const styles = useStyles();
const explorerVersion = configContext.gitSha;
const isEmulator = configContext.platform === Platform.Emulator;
const shouldShowQueryPageOptions = userContext.apiType === "SQL";
const shouldShowGraphAutoVizOption = userContext.apiType === "Gremlin";
const shouldShowCrossPartitionOption = userContext.apiType !== "Gremlin";
const shouldShowParallelismOption = userContext.apiType !== "Gremlin";
const shouldShowPriorityLevelOption = PriorityBasedExecutionUtils.isFeatureEnabled();
const showRetrySettings =
(userContext.apiType === "SQL" || userContext.apiType === "Tables" || userContext.apiType === "Gremlin") &&
!isEmulator;
const shouldShowGraphAutoVizOption = userContext.apiType === "Gremlin" && !isEmulator;
const shouldShowCrossPartitionOption = userContext.apiType !== "Gremlin" && !isEmulator;
const shouldShowParallelismOption = userContext.apiType !== "Gremlin" && !isEmulator;
const showEnableEntraIdRbac =
isDataplaneRbacSupported(userContext.apiType) &&
userContext.authType === AuthType.AAD &&
configContext.platform !== Platform.Fabric &&
!isEmulator;
const shouldShowPriorityLevelOption = PriorityBasedExecutionUtils.isFeatureEnabled() && !isEmulator;
const shouldShowCopilotSampleDBOption =
userContext.apiType === "SQL" &&
useQueryCopilot.getState().copilotEnabled &&
useDatabases.getState().sampleDataResourceTokenCollection;
useDatabases.getState().sampleDataResourceTokenCollection &&
!isEmulator;
const handlerOnSubmit = async () => {
setIsExecuting(true);
@@ -491,7 +503,7 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
return (
<RightPaneForm {...genericPaneProps}>
<div className={`paneMainContent ${styles.container}`}>
<Accordion className={styles.firstItem}>
<Accordion className={`customAccordion ${styles.firstItem}`}>
{shouldShowQueryPageOptions && (
<AccordionItem value="1">
<AccordionHeader>
@@ -541,39 +553,37 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
</AccordionPanel>
</AccordionItem>
)}
{userContext.apiType === "SQL" &&
userContext.authType === AuthType.AAD &&
configContext.platform !== Platform.Fabric && (
<AccordionItem value="2">
<AccordionHeader>
<div className={styles.header}>Enable Entra ID RBAC</div>
</AccordionHeader>
<AccordionPanel>
<div className={styles.settingsSectionContainer}>
<div className={styles.settingsSectionDescription}>
Choose Automatic to enable Entra ID RBAC automatically. True/False to force enable/disable Entra
ID RBAC.
<a
href="https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-setup-rbac#use-data-explorer"
target="_blank"
rel="noopener noreferrer"
>
{" "}
Learn more{" "}
</a>
</div>
<ChoiceGroup
ariaLabelledBy="enableDataPlaneRBACOptions"
options={dataPlaneRBACOptionsList}
styles={choiceButtonStyles}
selectedKey={enableDataPlaneRBACOption}
onChange={handleOnDataPlaneRBACOptionChange}
/>
{showEnableEntraIdRbac && (
<AccordionItem value="2">
<AccordionHeader>
<div className={styles.header}>Enable Entra ID RBAC</div>
</AccordionHeader>
<AccordionPanel>
<div className={styles.settingsSectionContainer}>
<div className={styles.settingsSectionDescription}>
Choose Automatic to enable Entra ID RBAC automatically. True/False to force enable/disable Entra ID
RBAC.
<a
href="https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-setup-rbac#use-data-explorer"
target="_blank"
rel="noopener noreferrer"
>
{" "}
Learn more{" "}
</a>
</div>
</AccordionPanel>
</AccordionItem>
)}
{userContext.apiType === "SQL" && (
<ChoiceGroup
ariaLabelledBy="enableDataPlaneRBACOptions"
options={dataPlaneRBACOptionsList}
styles={choiceButtonStyles}
selectedKey={enableDataPlaneRBACOption}
onChange={handleOnDataPlaneRBACOptionChange}
/>
</div>
</AccordionPanel>
</AccordionItem>
)}
{userContext.apiType === "SQL" && !isEmulator && (
<>
<AccordionItem value="3">
<AccordionHeader>
@@ -671,7 +681,7 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
</AccordionItem>
</>
)}
{(userContext.apiType === "SQL" || userContext.apiType === "Tables" || userContext.apiType === "Gremlin") && (
{showRetrySettings && (
<AccordionItem value="6">
<AccordionHeader>
<div className={styles.header}>Retry Settings</div>
@@ -744,29 +754,30 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
</AccordionPanel>
</AccordionItem>
)}
<AccordionItem value="7">
<AccordionHeader>
<div className={styles.header}>Enable container pagination</div>
</AccordionHeader>
<AccordionPanel>
<div className={styles.settingsSectionContainer}>
<div className={styles.settingsSectionDescription}>
Load 50 containers at a time. Currently, containers are not pulled in alphanumeric order.
{!isEmulator && (
<AccordionItem value="7">
<AccordionHeader>
<div className={styles.header}>Enable container pagination</div>
</AccordionHeader>
<AccordionPanel>
<div className={styles.settingsSectionContainer}>
<div className={styles.settingsSectionDescription}>
Load 50 containers at a time. Currently, containers are not pulled in alphanumeric order.
</div>
<Checkbox
styles={{
label: { padding: 0 },
}}
className="padding"
ariaLabel="Enable container pagination"
checked={containerPaginationEnabled}
onChange={() => setContainerPaginationEnabled(!containerPaginationEnabled)}
label="Enable container pagination"
/>
</div>
<Checkbox
styles={{
label: { padding: 0 },
}}
className="padding"
ariaLabel="Enable container pagination"
checked={containerPaginationEnabled}
onChange={() => setContainerPaginationEnabled(!containerPaginationEnabled)}
label="Enable container pagination"
/>
</div>
</AccordionPanel>
</AccordionItem>
</AccordionPanel>
</AccordionItem>
)}
{shouldShowCrossPartitionOption && (
<AccordionItem value="8">
<AccordionHeader>

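For reference, a minimal self-contained sketch of the visibility gating introduced in this hunk (local type and parameter names are assumed; only the isDataplaneRbacSupported check itself comes from the diff):

type ApiType = "SQL" | "Tables" | "Gremlin" | "Cassandra" | "Mongo" | "Postgres";

// Mirrors Utils/APITypeUtils.isDataplaneRbacSupported added later in this change set.
const isDataplaneRbacSupported = (apiType: ApiType): boolean => apiType === "SQL" || apiType === "Tables";

// Illustrative gating: every option is additionally hidden on the emulator, and the
// Entra ID RBAC section is limited to AAD auth outside the Fabric platform.
function computeSettingsVisibility(args: {
  apiType: ApiType;
  isAadAuth: boolean;
  isFabric: boolean;
  isEmulator: boolean;
}) {
  const { apiType, isAadAuth, isFabric, isEmulator } = args;
  return {
    showRetrySettings: (apiType === "SQL" || apiType === "Tables" || apiType === "Gremlin") && !isEmulator,
    showEnableEntraIdRbac: isDataplaneRbacSupported(apiType) && isAadAuth && !isFabric && !isEmulator,
    shouldShowCrossPartitionOption: apiType !== "Gremlin" && !isEmulator,
    shouldShowParallelismOption: apiType !== "Gremlin" && !isEmulator,
  };
}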
View File

@@ -11,7 +11,7 @@ exports[`Settings Pane should render Default properly 1`] = `
className="paneMainContent ___133e6fg_0000000 f22iagw f1vx9l62 f1l02sjl"
>
<Accordion
className="___1uf6361_0000000 fz7g6wx"
className="customAccordion ___1uf6361_0000000 fz7g6wx"
>
<AccordionItem
value="1"
@@ -572,7 +572,7 @@ exports[`Settings Pane should render Gremlin properly 1`] = `
className="paneMainContent ___133e6fg_0000000 f22iagw f1vx9l62 f1l02sjl"
>
<Accordion
className="___1uf6361_0000000 fz7g6wx"
className="customAccordion ___1uf6361_0000000 fz7g6wx"
>
<AccordionItem
value="6"

View File

@@ -319,6 +319,7 @@ exports[`AddCollectionPanel should render Default properly 1`] = `
Enable analytical store capability to perform near real-time analytics on your operational data, without impacting the performance of transactional workloads.
<StyledLinkBase
aria-label="Learn more about analytical store."
href="https://aka.ms/analytical-store-overview"
target="_blank"
>
@@ -383,6 +384,7 @@ exports[`AddCollectionPanel should render Default properly 1`] = `
. Enable Synapse Link for this Cosmos DB account.
<StyledLinkBase
aria-label="Learn more about Azure Synapse Link."
className="capacitycalculator-link"
href="https://aka.ms/cosmosdb-synapselink"
target="_blank"

View File

@@ -75,6 +75,7 @@ export const QueryCopilotPromptbar: React.FC<QueryCopilotPromptProps> = ({
const inputEdited = useRef(false);
const itemRefs = useRef([]);
const searchInputRef = useRef(null);
const copyQueryRef = useRef(null);
const {
openFeedbackModal,
hideFeedbackModalForLikedQueries,
@@ -132,6 +133,7 @@ export const QueryCopilotPromptbar: React.FC<QueryCopilotPromptProps> = ({
document.body.removeChild(queryElement);
setshowCopyPopup(true);
copyQueryRef.current.focus();
setTimeout(() => {
setshowCopyPopup(false);
}, 6000);
@@ -305,7 +307,7 @@ export const QueryCopilotPromptbar: React.FC<QueryCopilotPromptProps> = ({
if (isGeneratingQuery === null) {
return " ";
} else if (isGeneratingQuery) {
return "Content is loading";
return "Thinking";
} else {
return "Content is updated";
}
@@ -400,6 +402,7 @@ export const QueryCopilotPromptbar: React.FC<QueryCopilotPromptProps> = ({
<IconButton
iconProps={{ iconName: "Send" }}
disabled={isGeneratingQuery || !userPrompt.trim()}
allowDisabledFocus={true}
style={{ background: "none" }}
onClick={() => startGenerateQueryProcess()}
aria-label="Send"
@@ -676,6 +679,7 @@ export const QueryCopilotPromptbar: React.FC<QueryCopilotPromptProps> = ({
)}
<CommandBarButton
className="copyQuery"
elementRef={copyQueryRef}
onClick={copyGeneratedCode}
iconProps={{ iconName: "Copy" }}
style={{ fontSize: 12, transition: "background-color 0.3s ease", height: "100%" }}
@@ -706,6 +710,9 @@ export const QueryCopilotPromptbar: React.FC<QueryCopilotPromptProps> = ({
)}
</Stack>
)}
{(showFeedbackBar || isGeneratingQuery) && (
<span role="alert" className="screenReaderOnly" aria-label={getAriaLabel()} />
)}
{isGeneratingQuery && (
<ProgressIndicator
label="Thinking..."

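The hunks above move keyboard focus back to the Copy button after copying and surface progress through a hidden live region. A standalone sketch of the same pattern, with illustrative component and prop names (not the repo's exports):

import React, { useRef, useState } from "react";

export const CopyWithFocusReturn: React.FC<{ textToCopy: string }> = ({ textToCopy }) => {
  const copyButtonRef = useRef<HTMLButtonElement>(null);
  const [showCopied, setShowCopied] = useState(false);

  const onCopy = async (): Promise<void> => {
    await navigator.clipboard.writeText(textToCopy);
    setShowCopied(true);
    // Return focus to the triggering control so screen reader users keep their context.
    copyButtonRef.current?.focus();
    setTimeout(() => setShowCopied(false), 6000);
  };

  return (
    <>
      <button ref={copyButtonRef} onClick={onCopy}>
        Copy query
      </button>
      {/* role="alert" announces the status change without moving focus. */}
      {showCopied && <span role="alert" className="screenReaderOnly">Query copied</span>}
    </>
  );
};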
View File

@@ -1,7 +1,6 @@
import { FeedOptions } from "@azure/cosmos";
import {
Areas,
BackendApi,
ConnectionStatusType,
ContainerStatusType,
HttpStatusCodes,
@@ -32,7 +31,6 @@ import { Action } from "Shared/Telemetry/TelemetryConstants";
import { traceFailure, traceStart, traceSuccess } from "Shared/Telemetry/TelemetryProcessor";
import { userContext } from "UserContext";
import { getAuthorizationHeader } from "Utils/AuthorizationUtils";
import { useNewPortalBackendEndpoint } from "Utils/EndpointUtils";
import { queryPagesUntilContentPresent } from "Utils/QueryUtils";
import { QueryCopilotState, useQueryCopilot } from "hooks/useQueryCopilot";
import { useTabs } from "hooks/useTabs";
@@ -82,9 +80,7 @@ export const isCopilotFeatureRegistered = async (subscriptionId: string): Promis
};
export const getCopilotEnabled = async (): Promise<boolean> => {
const backendEndpoint: string = useNewPortalBackendEndpoint(BackendApi.PortalSettings)
? configContext.PORTAL_BACKEND_ENDPOINT
: configContext.BACKEND_ENDPOINT;
const backendEndpoint: string = configContext.PORTAL_BACKEND_ENDPOINT;
const url = `${backendEndpoint}/api/portalsettings/querycopilot`;
const authorizationHeader: AuthorizationTokenHeaderMetadata = getAuthorizationHeader();
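With the legacy BACKEND_ENDPOINT branch removed, getCopilotEnabled reduces to a single request against the portal backend. An illustrative continuation of the hunk (the response shape and header plumbing are assumptions, not the repo's contract):

async function getCopilotEnabledSketch(
  portalBackendEndpoint: string,
  authHeader: { header: string; token: string },
): Promise<boolean> {
  const url = `${portalBackendEndpoint}/api/portalsettings/querycopilot`;
  const response = await fetch(url, { headers: { [authHeader.header]: authHeader.token } });
  if (!response.ok) {
    throw new Error(`Portal settings request failed with status ${response.status}`);
  }
  // Assumed payload shape; the real contract is defined by the portal backend API.
  const payload = (await response.json()) as { isEnabled?: boolean };
  return payload.isEnabled === true;
}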

View File

@@ -26,7 +26,7 @@ import { getCollectionName, getDatabaseName } from "Utils/APITypeUtils";
import { Allotment, AllotmentHandle } from "allotment";
import { useSidePanel } from "hooks/useSidePanel";
import { debounce } from "lodash";
import React, { useCallback, useEffect, useMemo, useRef, useState } from "react";
import React, { useCallback, useEffect, useLayoutEffect, useMemo, useRef, useState } from "react";
const useSidebarStyles = makeStyles({
sidebar: {
@@ -109,6 +109,7 @@ interface GlobalCommand {
icon: JSX.Element;
onClick: () => void;
keyboardAction?: KeyboardAction;
ref?: React.RefObject<HTMLButtonElement>;
}
const GlobalCommands: React.FC<GlobalCommandsProps> = ({ explorer }) => {
@@ -118,6 +119,7 @@ const GlobalCommands: React.FC<GlobalCommandsProps> = ({ explorer }) => {
// However, that messes with the Menu positioning, so we need to get a reference to the 'div' to pass to the Menu.
// We can't use a ref though, because it would be set after the Menu is rendered, so we use a state value to force a re-render.
const [globalCommandButton, setGlobalCommandButton] = useState<HTMLElement | null>(null);
const primaryFocusableRef = useRef<HTMLButtonElement>(null);
const actions = useMemo<GlobalCommand[]>(() => {
if (
@@ -177,6 +179,16 @@ const GlobalCommands: React.FC<GlobalCommandsProps> = ({ explorer }) => {
);
}, [actions, setKeyboardActions]);
useLayoutEffect(() => {
if (primaryFocusableRef.current) {
const timer = setTimeout(() => {
primaryFocusableRef.current.focus();
}, 0);
return () => clearTimeout(timer);
}
return undefined;
}, []);
if (!primaryAction) {
return null;
}
@@ -184,7 +196,7 @@ const GlobalCommands: React.FC<GlobalCommandsProps> = ({ explorer }) => {
return (
<div className={styles.globalCommandsContainer} data-test="GlobalCommands">
{actions.length === 1 ? (
<Button icon={primaryAction.icon} onClick={onPrimaryActionClick}>
<Button icon={primaryAction.icon} onClick={onPrimaryActionClick} ref={primaryFocusableRef}>
{primaryAction.label}
</Button>
) : (
@@ -194,7 +206,7 @@ const GlobalCommands: React.FC<GlobalCommandsProps> = ({ explorer }) => {
<div ref={setGlobalCommandButton}>
<SplitButton
menuButton={{ ...triggerProps, "aria-label": "More commands" }}
primaryActionButton={{ onClick: onPrimaryActionClick }}
primaryActionButton={{ onClick: onPrimaryActionClick, ref: primaryFocusableRef }}
className={styles.globalCommandsSplitButton}
icon={primaryAction.icon}
>

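The useLayoutEffect added above defers the initial focus by one tick so it runs after the surrounding layout settles. The same pattern as a small reusable hook (a sketch, not an export of this repo):

import { useLayoutEffect, useRef } from "react";

export function usePrimaryFocusOnMount<T extends HTMLElement>() {
  const ref = useRef<T>(null);
  useLayoutEffect(() => {
    if (!ref.current) {
      return undefined;
    }
    // setTimeout(..., 0) pushes focus past the current layout/render pass.
    const timer = setTimeout(() => ref.current?.focus(), 0);
    return () => clearTimeout(timer);
  }, []);
  return ref;
}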
View File

@@ -39,7 +39,7 @@ export const SplashScreenButton: React.FC<SplashScreenButtonProps> = ({
role="button"
>
<div>
<img src={imgSrc} />
<img src={imgSrc} alt={title} aria-hidden="true" />
</div>
<Stack style={{ marginLeft: 16 }}>
<Text style={{ fontSize: 18, fontWeight: 600 }}>{title}</Text>

View File

@@ -3,7 +3,7 @@ import * as ko from "knockout";
import Q from "q";
import { AuthType } from "../../AuthType";
import * as Constants from "../../Common/Constants";
import { CassandraProxyAPIs, CassandraProxyEndpoints } from "../../Common/Constants";
import { CassandraProxyAPIs } from "../../Common/Constants";
import { handleError } from "../../Common/ErrorHandlingUtils";
import * as HeadersUtility from "../../Common/HeadersUtility";
import { createDocument } from "../../Common/dataAccess/createDocument";
@@ -264,9 +264,6 @@ export class CassandraAPIDataClient extends TableDataClient {
shouldNotify?: boolean,
paginationToken?: string,
): Promise<Entities.IListTableEntitiesResult> {
if (!this.useCassandraProxyEndpoint("postQuery")) {
return this.queryDocuments_ToBeDeprecated(collection, query, shouldNotify, paginationToken);
}
const clearMessage =
shouldNotify && NotificationConsoleUtils.logConsoleProgress(`Querying rows for table ${collection.id()}`);
try {
@@ -309,55 +306,6 @@ export class CassandraAPIDataClient extends TableDataClient {
}
}
public async queryDocuments_ToBeDeprecated(
collection: ViewModels.Collection,
query: string,
shouldNotify?: boolean,
paginationToken?: string,
): Promise<Entities.IListTableEntitiesResult> {
const clearMessage =
shouldNotify && NotificationConsoleUtils.logConsoleProgress(`Querying rows for table ${collection.id()}`);
try {
const { authType, databaseAccount } = userContext;
const apiEndpoint: string =
authType === AuthType.EncryptedToken
? Constants.CassandraBackend.guestQueryApi
: Constants.CassandraBackend.queryApi;
const data: any = await $.ajax(`${configContext.BACKEND_ENDPOINT}/${apiEndpoint}`, {
type: "POST",
data: {
accountName: databaseAccount?.name,
cassandraEndpoint: this.trimCassandraEndpoint(databaseAccount?.properties.cassandraEndpoint),
resourceId: databaseAccount?.id,
keyspaceId: collection.databaseId,
tableId: collection.id(),
query,
paginationToken,
},
beforeSend: this.setAuthorizationHeader as any,
cache: false,
});
shouldNotify &&
NotificationConsoleUtils.logConsoleInfo(
`Successfully fetched ${data.result.length} rows for table ${collection.id()}`,
);
return {
Results: data.result,
ContinuationToken: data.paginationToken,
};
} catch (error) {
shouldNotify &&
handleError(
error,
"QueryDocuments_ToBeDeprecated_Cassandra",
`Failed to query rows for table ${collection.id()}`,
);
throw error;
} finally {
clearMessage?.();
}
}
public async deleteDocuments(
collection: ViewModels.Collection,
entitiesToDelete: Entities.ITableEntity[],
@@ -471,10 +419,6 @@ export class CassandraAPIDataClient extends TableDataClient {
}
public getTableKeys(collection: ViewModels.Collection): Q.Promise<CassandraTableKeys> {
if (!this.useCassandraProxyEndpoint("getKeys")) {
return this.getTableKeys_ToBeDeprecated(collection);
}
if (!!collection.cassandraKeys) {
return Q.resolve(collection.cassandraKeys);
}
@@ -515,52 +459,7 @@ export class CassandraAPIDataClient extends TableDataClient {
return deferred.promise;
}
public getTableKeys_ToBeDeprecated(collection: ViewModels.Collection): Q.Promise<CassandraTableKeys> {
if (!!collection.cassandraKeys) {
return Q.resolve(collection.cassandraKeys);
}
const clearInProgressMessage = logConsoleProgress(`Fetching keys for table ${collection.id()}`);
const { authType, databaseAccount } = userContext;
const apiEndpoint: string =
authType === AuthType.EncryptedToken
? Constants.CassandraBackend.guestKeysApi
: Constants.CassandraBackend.keysApi;
let endpoint = `${configContext.BACKEND_ENDPOINT}/${apiEndpoint}`;
const deferred = Q.defer<CassandraTableKeys>();
$.ajax(endpoint, {
type: "POST",
data: {
accountName: databaseAccount?.name,
cassandraEndpoint: this.trimCassandraEndpoint(databaseAccount?.properties.cassandraEndpoint),
resourceId: databaseAccount?.id,
keyspaceId: collection.databaseId,
tableId: collection.id(),
},
beforeSend: this.setAuthorizationHeader as any,
cache: false,
})
.then(
(data: CassandraTableKeys) => {
collection.cassandraKeys = data;
logConsoleInfo(`Successfully fetched keys for table ${collection.id()}`);
deferred.resolve(data);
},
(error: any) => {
const errorText = error.responseJSON?.message ?? JSON.stringify(error);
handleError(errorText, "FetchKeysCassandra", `Error fetching keys for table ${collection.id()}`);
deferred.reject(errorText);
},
)
.done(clearInProgressMessage);
return deferred.promise;
}
public getTableSchema(collection: ViewModels.Collection): Q.Promise<CassandraTableKey[]> {
if (!this.useCassandraProxyEndpoint("getSchema")) {
return this.getTableSchema_ToBeDeprecated(collection);
}
if (!!collection.cassandraSchema) {
return Q.resolve(collection.cassandraSchema);
}
@@ -602,52 +501,7 @@ export class CassandraAPIDataClient extends TableDataClient {
return deferred.promise;
}
public getTableSchema_ToBeDeprecated(collection: ViewModels.Collection): Q.Promise<CassandraTableKey[]> {
if (!!collection.cassandraSchema) {
return Q.resolve(collection.cassandraSchema);
}
const clearInProgressMessage = logConsoleProgress(`Fetching schema for table ${collection.id()}`);
const { databaseAccount, authType } = userContext;
const apiEndpoint: string =
authType === AuthType.EncryptedToken
? Constants.CassandraBackend.guestSchemaApi
: Constants.CassandraBackend.schemaApi;
let endpoint = `${configContext.BACKEND_ENDPOINT}/${apiEndpoint}`;
const deferred = Q.defer<CassandraTableKey[]>();
$.ajax(endpoint, {
type: "POST",
data: {
accountName: databaseAccount?.name,
cassandraEndpoint: this.trimCassandraEndpoint(databaseAccount?.properties.cassandraEndpoint),
resourceId: databaseAccount?.id,
keyspaceId: collection.databaseId,
tableId: collection.id(),
},
beforeSend: this.setAuthorizationHeader as any,
cache: false,
})
.then(
(data: any) => {
collection.cassandraSchema = data.columns;
logConsoleInfo(`Successfully fetched schema for table ${collection.id()}`);
deferred.resolve(data.columns);
},
(error: any) => {
const errorText = error.responseJSON?.message ?? JSON.stringify(error);
handleError(errorText, "FetchSchemaCassandra", `Error fetching schema for table ${collection.id()}`);
deferred.reject(errorText);
},
)
.done(clearInProgressMessage);
return deferred.promise;
}
private createOrDeleteQuery(cassandraEndpoint: string, resourceId: string, query: string): Q.Promise<any> {
if (!this.useCassandraProxyEndpoint("createOrDelete")) {
return this.createOrDeleteQuery_ToBeDeprecated(cassandraEndpoint, resourceId, query);
}
const deferred = Q.defer();
const { authType, databaseAccount } = userContext;
const apiEndpoint: string =
@@ -677,38 +531,6 @@ export class CassandraAPIDataClient extends TableDataClient {
return deferred.promise;
}
private createOrDeleteQuery_ToBeDeprecated(
cassandraEndpoint: string,
resourceId: string,
query: string,
): Q.Promise<any> {
const deferred = Q.defer();
const { authType, databaseAccount } = userContext;
const apiEndpoint: string =
authType === AuthType.EncryptedToken
? Constants.CassandraBackend.guestCreateOrDeleteApi
: Constants.CassandraBackend.createOrDeleteApi;
$.ajax(`${configContext.BACKEND_ENDPOINT}/${apiEndpoint}`, {
type: "POST",
data: {
accountName: databaseAccount?.name,
cassandraEndpoint: this.trimCassandraEndpoint(cassandraEndpoint),
resourceId: resourceId,
query: query,
},
beforeSend: this.setAuthorizationHeader as any,
cache: false,
}).then(
(data: any) => {
deferred.resolve();
},
(reason) => {
deferred.reject(reason);
},
);
return deferred.promise;
}
private trimCassandraEndpoint(cassandraEndpoint: string): string {
if (!cassandraEndpoint) {
return cassandraEndpoint;
@@ -747,23 +569,4 @@ export class CassandraAPIDataClient extends TableDataClient {
private getCassandraPartitionKeyProperty(collection: ViewModels.Collection): string {
return collection.cassandraKeys.partitionKeys[0].property;
}
private useCassandraProxyEndpoint(api: string): boolean {
const activeCassandraProxyEndpoints: string[] = [
CassandraProxyEndpoints.Development,
CassandraProxyEndpoints.Mpac,
CassandraProxyEndpoints.Prod,
CassandraProxyEndpoints.Fairfax,
CassandraProxyEndpoints.Mooncake,
];
if (configContext.globallyEnabledCassandraAPIs.includes(api)) {
return true;
}
return (
configContext.NEW_CASSANDRA_APIS?.includes(api) &&
activeCassandraProxyEndpoints.includes(configContext.CASSANDRA_PROXY_ENDPOINT)
);
}
}
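With the *_ToBeDeprecated paths deleted, every Cassandra data call now goes through the Cassandra proxy unconditionally. A reduced sketch of that request shape (endpoint constants, API paths, and auth handling are assumptions standing in for the repo's CassandraProxyAPIs and header utilities):

async function postToCassandraProxy<T>(
  cassandraProxyEndpoint: string, // configContext.CASSANDRA_PROXY_ENDPOINT in the repo
  apiPath: string, // e.g. a CassandraProxyAPIs constant
  body: Record<string, unknown>,
  authorizationToken: string,
): Promise<T> {
  const response = await fetch(`${cassandraProxyEndpoint}/${apiPath}`, {
    method: "POST",
    headers: { "content-type": "application/json", authorization: authorizationToken },
    body: JSON.stringify(body),
  });
  if (!response.ok) {
    throw new Error(`Cassandra proxy call failed with status ${response.status}`);
  }
  return (await response.json()) as T;
}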

View File

@@ -1150,27 +1150,16 @@ export const DocumentsTabComponent: React.FunctionComponent<IDocumentsTabCompone
deletePromise = _bulkDeleteNoSqlDocuments(_collection, toDeleteDocumentIds);
}
} else {
if (isMongoBulkDeleteDisabled) {
// TODO: Once new mongo proxy is available for all users, remove the call for MongoProxyClient.deleteDocument().
// MongoProxyClient.deleteDocuments() should be called for all users.
deletePromise = MongoProxyClient.deleteDocument(
_collection.databaseId,
_collection as ViewModels.Collection,
toDeleteDocumentIds[0],
).then(() => [toDeleteDocumentIds[0]]);
// ----------------------------------------------------------------------------------------------------
} else {
deletePromise = MongoProxyClient.deleteDocuments(
_collection.databaseId,
_collection as ViewModels.Collection,
toDeleteDocumentIds,
).then(({ deletedCount, isAcknowledged }) => {
if (deletedCount === toDeleteDocumentIds.length && isAcknowledged) {
return toDeleteDocumentIds;
}
throw new Error(`Delete failed with deletedCount: ${deletedCount} and isAcknowledged: ${isAcknowledged}`);
});
}
deletePromise = MongoProxyClient.deleteDocuments(
_collection.databaseId,
_collection as ViewModels.Collection,
toDeleteDocumentIds,
).then(({ deletedCount, isAcknowledged }) => {
if (deletedCount === toDeleteDocumentIds.length && isAcknowledged) {
return toDeleteDocumentIds;
}
throw new Error(`Delete failed with deletedCount: ${deletedCount} and isAcknowledged: ${isAcknowledged}`);
});
}
return deletePromise
@@ -2054,11 +2043,8 @@ export const DocumentsTabComponent: React.FunctionComponent<IDocumentsTabCompone
}
}, [prevSelectedColumnIds, refreshDocumentsGrid, selectedColumnIds]);
// TODO: remove isMongoBulkDeleteDisabled when new mongo proxy is enabled for all users
// TODO: remove partitionKey.systemKey when JS SDK bug is fixed
const isMongoBulkDeleteDisabled = !MongoProxyClient.useMongoProxyEndpoint(Constants.MongoProxyApi.BulkDelete);
const isBulkDeleteDisabled =
(partitionKey.systemKey && !isPreferredApiMongoDB) || (isPreferredApiMongoDB && isMongoBulkDeleteDisabled);
const isBulkDeleteDisabled = partitionKey.systemKey && !isPreferredApiMongoDB;
// -------------------------------------------------------
const getFilterChoices = (): InputDatalistDropdownOptionSection[] => {

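With the single-document fallback gone, every Mongo delete path uses deleteDocuments and the same acknowledgement check. That check as a standalone helper (field names taken from the hunk above; the wrapper itself is illustrative):

function assertBulkDeleteSucceeded(
  requestedIds: string[],
  result: { deletedCount: number; isAcknowledged: boolean },
): string[] {
  if (result.deletedCount === requestedIds.length && result.isAcknowledged) {
    return requestedIds;
  }
  throw new Error(
    `Delete failed with deletedCount: ${result.deletedCount} and isAcknowledged: ${result.isAcknowledged}`,
  );
}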
View File

@@ -1,6 +1,4 @@
import { useMongoProxyEndpoint } from "Common/MongoProxyClient";
import React, { Component } from "react";
import * as Constants from "../../../Common/Constants";
import { configContext } from "../../../ConfigContext";
import * as ViewModels from "../../../Contracts/ViewModels";
import { Action, ActionModifiers } from "../../../Shared/Telemetry/TelemetryConstants";
@@ -50,15 +48,13 @@ export default class MongoShellTabComponent extends Component<
IMongoShellTabComponentStates
> {
private _logTraces: Map<string, number>;
private _useMongoProxyEndpoint: boolean;
constructor(props: IMongoShellTabComponentProps) {
super(props);
this._logTraces = new Map();
this._useMongoProxyEndpoint = useMongoProxyEndpoint(Constants.MongoProxyApi.LegacyMongoShell);
this.state = {
url: getMongoShellUrl(this._useMongoProxyEndpoint),
url: getMongoShellUrl(),
};
props.onMongoShellTabAccessor({
@@ -113,17 +109,9 @@ export default class MongoShellTabComponent extends Component<
const resourceId = databaseAccount?.id;
const accountName = databaseAccount?.name;
const documentEndpoint = databaseAccount?.properties.mongoEndpoint || databaseAccount?.properties.documentEndpoint;
const mongoEndpoint =
documentEndpoint.substr(
Constants.MongoDBAccounts.protocol.length + 3,
documentEndpoint.length -
(Constants.MongoDBAccounts.protocol.length + 2 + Constants.MongoDBAccounts.defaultPort.length),
) + Constants.MongoDBAccounts.defaultPort.toString();
const databaseId = this.props.collection.databaseId;
const collectionId = this.props.collection.id();
const apiEndpoint = this._useMongoProxyEndpoint
? configContext.MONGO_PROXY_ENDPOINT
: configContext.BACKEND_ENDPOINT;
const apiEndpoint = configContext.MONGO_PROXY_ENDPOINT;
const encryptedAuthToken: string = userContext.accessToken;
shellIframe.contentWindow.postMessage(
@@ -132,7 +120,7 @@ export default class MongoShellTabComponent extends Component<
data: {
resourceId: resourceId,
accountName: accountName,
mongoEndpoint: this._useMongoProxyEndpoint ? documentEndpoint : mongoEndpoint,
mongoEndpoint: documentEndpoint,
authorization: authorization,
databaseId: databaseId,
collectionId: collectionId,

View File

@@ -2,8 +2,6 @@ import { Platform, resetConfigContext, updateConfigContext } from "../../../Conf
import { updateUserContext, userContext } from "../../../UserContext";
import { getMongoShellUrl } from "./getMongoShellUrl";
const mongoBackendEndpoint = "https://localhost:1234";
describe("getMongoShellUrl", () => {
let queryString = "";
@@ -11,7 +9,6 @@ describe("getMongoShellUrl", () => {
resetConfigContext();
updateConfigContext({
BACKEND_ENDPOINT: mongoBackendEndpoint,
platform: Platform.Hosted,
});
@@ -37,12 +34,7 @@ describe("getMongoShellUrl", () => {
queryString = `resourceId=${userContext.databaseAccount.id}&accountName=${userContext.databaseAccount.name}&mongoEndpoint=${userContext.databaseAccount.properties.documentEndpoint}`;
});
it("should return /indexv2.html by default", () => {
expect(getMongoShellUrl().toString()).toContain(`/indexv2.html?${queryString}`);
});
it("should return /index.html when useMongoProxyEndpoint is true", () => {
const useMongoProxyEndpoint: boolean = true;
expect(getMongoShellUrl(useMongoProxyEndpoint).toString()).toContain(`/index.html?${queryString}`);
it("should return /index.html by default", () => {
expect(getMongoShellUrl().toString()).toContain(`/index.html?${queryString}`);
});
});

View File

@@ -1,11 +1,11 @@
import { userContext } from "../../../UserContext";
export function getMongoShellUrl(useMongoProxyEndpoint?: boolean): string {
export function getMongoShellUrl(): string {
const { databaseAccount: account } = userContext;
const resourceId = account?.id;
const accountName = account?.name;
const mongoEndpoint = account?.properties?.mongoEndpoint || account?.properties?.documentEndpoint;
const queryString = `resourceId=${resourceId}&accountName=${accountName}&mongoEndpoint=${mongoEndpoint}`;
return useMongoProxyEndpoint ? `/mongoshell/index.html?${queryString}` : `/mongoshell/indexv2.html?${queryString}`;
return `/mongoshell/index.html?${queryString}`;
}

View File

@@ -375,6 +375,7 @@ class QueryTabComponentImpl extends React.Component<QueryTabComponentImplProps,
ruCapPerOperation: ruThreshold,
} as QueryOperationOptions;
}
const queryDocuments = async (firstItemIndex: number) =>
await queryDocumentsPage(
this.props.collection && this.props.collection.id(),

View File

@@ -1,7 +1,3 @@
import { IMessageBarStyles, MessageBar, MessageBarType } from "@fluentui/react";
import { CassandraProxyEndpoints, MongoProxyEndpoints } from "Common/Constants";
import { configContext } from "ConfigContext";
import { IpRule } from "Contracts/DataModels";
import { CollectionTabKind } from "Contracts/ViewModels";
import Explorer from "Explorer/Explorer";
import { useCommandBar } from "Explorer/Menus/CommandBar/CommandBarComponentAdapter";
@@ -12,10 +8,8 @@ import { PostgresConnectTab } from "Explorer/Tabs/PostgresConnectTab";
import { QuickstartTab } from "Explorer/Tabs/QuickstartTab";
import { VcoreMongoConnectTab } from "Explorer/Tabs/VCoreMongoConnectTab";
import { VcoreMongoQuickstartTab } from "Explorer/Tabs/VCoreMongoQuickstartTab";
import { LayoutConstants } from "Explorer/Theme/ThemeUtil";
import { KeyboardAction, KeyboardActionGroup, useKeyboardActionGroup } from "KeyboardShortcuts";
import { userContext } from "UserContext";
import { CassandraProxyOutboundIPs, MongoProxyOutboundIPs, PortalBackendIPs } from "Utils/EndpointUtils";
import { useTeachingBubble } from "hooks/useTeachingBubble";
import ko from "knockout";
import React, { MutableRefObject, useEffect, useRef, useState } from "react";
@@ -34,10 +28,6 @@ interface TabsProps {
export const Tabs = ({ explorer }: TabsProps): JSX.Element => {
const { openedTabs, openedReactTabs, activeTab, activeReactTab } = useTabs();
const [
showMongoAndCassandraProxiesNetworkSettingsWarningState,
setShowMongoAndCassandraProxiesNetworkSettingsWarningState,
] = useState<boolean>(showMongoAndCassandraProxiesNetworkSettingsWarning());
const setKeyboardHandlers = useKeyboardActionGroup(KeyboardActionGroup.TABS);
useEffect(() => {
@@ -48,28 +38,8 @@ export const Tabs = ({ explorer }: TabsProps): JSX.Element => {
});
}, [setKeyboardHandlers]);
const defaultMessageBarStyles: IMessageBarStyles = {
root: {
height: `${LayoutConstants.rowHeight}px`,
overflow: "hidden",
flexDirection: "row",
},
};
return (
<div className="tabsManagerContainer">
{showMongoAndCassandraProxiesNetworkSettingsWarningState && (
<MessageBar
messageBarType={MessageBarType.warning}
styles={defaultMessageBarStyles}
onDismiss={() => {
setShowMongoAndCassandraProxiesNetworkSettingsWarningState(false);
}}
>
{`We have migrated our middleware to new infrastructure. To avoid issues with Data Explorer access, please
re-enable "Allow access from Azure Portal" on the Networking blade for your account.`}
</MessageBar>
)}
<div className="nav-tabs-margin">
<ul className="nav nav-tabs level navTabHeight" id="navTabs" role="tablist">
{openedReactTabs.map((tab) => (
@@ -203,7 +173,7 @@ const CloseButton = ({
onKeyPress={({ nativeEvent: e }) => (tab ? tab.onKeyPressClose(undefined, e) : onKeyPressReactTabClose(e, tabKind))}
>
<span className="tabIcon close-Icon">
<img src={errorIcon} title="Close" alt="Close" role="none" />
<img src={errorIcon} title="Close" alt="Close" aria-label="hidden" />
</span>
</span>
);
@@ -314,57 +284,3 @@ const getReactTabContent = (activeReactTab: ReactTabKind, explorer: Explorer): J
throw Error(`Unsupported tab kind ${ReactTabKind[activeReactTab]}`);
}
};
const showMongoAndCassandraProxiesNetworkSettingsWarning = (): boolean => {
const ipRules: IpRule[] = userContext.databaseAccount?.properties?.ipRules;
if (
((userContext.apiType === "Mongo" && configContext.MONGO_PROXY_ENDPOINT !== MongoProxyEndpoints.Development) ||
(userContext.apiType === "Cassandra" &&
configContext.CASSANDRA_PROXY_ENDPOINT !== CassandraProxyEndpoints.Development)) &&
ipRules?.length
) {
const legacyPortalBackendIPs: string[] = PortalBackendIPs[configContext.BACKEND_ENDPOINT];
const ipAddressesFromIPRules: string[] = ipRules.map((ipRule) => ipRule.ipAddressOrRange);
const ipRulesIncludeLegacyPortalBackend: boolean = legacyPortalBackendIPs.every((legacyPortalBackendIP: string) =>
ipAddressesFromIPRules.includes(legacyPortalBackendIP),
);
if (!ipRulesIncludeLegacyPortalBackend) {
return false;
}
if (userContext.apiType === "Mongo") {
const isProdOrMpacMongoProxyEndpoint: boolean = [MongoProxyEndpoints.Mpac, MongoProxyEndpoints.Prod].includes(
configContext.MONGO_PROXY_ENDPOINT,
);
const mongoProxyOutboundIPs: string[] = isProdOrMpacMongoProxyEndpoint
? [...MongoProxyOutboundIPs[MongoProxyEndpoints.Mpac], ...MongoProxyOutboundIPs[MongoProxyEndpoints.Prod]]
: MongoProxyOutboundIPs[configContext.MONGO_PROXY_ENDPOINT];
const ipRulesIncludeMongoProxy: boolean = mongoProxyOutboundIPs.every((mongoProxyOutboundIP: string) =>
ipAddressesFromIPRules.includes(mongoProxyOutboundIP),
);
return !ipRulesIncludeMongoProxy;
} else if (userContext.apiType === "Cassandra") {
const isProdOrMpacCassandraProxyEndpoint: boolean = [
CassandraProxyEndpoints.Mpac,
CassandraProxyEndpoints.Prod,
].includes(configContext.CASSANDRA_PROXY_ENDPOINT);
const cassandraProxyOutboundIPs: string[] = isProdOrMpacCassandraProxyEndpoint
? [
...CassandraProxyOutboundIPs[CassandraProxyEndpoints.Mpac],
...CassandraProxyOutboundIPs[CassandraProxyEndpoints.Prod],
]
: CassandraProxyOutboundIPs[configContext.CASSANDRA_PROXY_ENDPOINT];
const ipRulesIncludeCassandraProxy: boolean = cassandraProxyOutboundIPs.every(
(cassandraProxyOutboundIP: string) => ipAddressesFromIPRules.includes(cassandraProxyOutboundIP),
);
return !ipRulesIncludeCassandraProxy;
}
}
return false;
};

View File

@@ -1,13 +1,11 @@
import { useBoolean } from "@fluentui/react-hooks";
import { userContext } from "UserContext";
import { useNewPortalBackendEndpoint } from "Utils/EndpointUtils";
import * as React from "react";
import ConnectImage from "../../../../images/HdeConnectCosmosDB.svg";
import ErrorImage from "../../../../images/error.svg";
import { AuthType } from "../../../AuthType";
import { BackendApi, HttpHeaders } from "../../../Common/Constants";
import { HttpHeaders } from "../../../Common/Constants";
import { configContext } from "../../../ConfigContext";
import { GenerateTokenResponse } from "../../../Contracts/DataModels";
import { isResourceTokenConnectionString } from "../Helpers/ResourceTokenUtils";
interface Props {
@@ -19,10 +17,6 @@ interface Props {
}
export const fetchEncryptedToken = async (connectionString: string): Promise<string> => {
if (!useNewPortalBackendEndpoint(BackendApi.GenerateToken)) {
return await fetchEncryptedToken_ToBeDeprecated(connectionString);
}
const headers = new Headers();
headers.append(HttpHeaders.connectionString, connectionString);
const url = configContext.PORTAL_BACKEND_ENDPOINT + "/api/connectionstring/token/generatetoken";
@@ -35,28 +29,10 @@ export const fetchEncryptedToken = async (connectionString: string): Promise<str
return decodeURIComponent(encryptedTokenResponse);
};
export const fetchEncryptedToken_ToBeDeprecated = async (connectionString: string): Promise<string> => {
const headers = new Headers();
headers.append(HttpHeaders.connectionString, connectionString);
const url = configContext.BACKEND_ENDPOINT + "/api/guest/tokens/generateToken";
const response = await fetch(url, { headers, method: "POST" });
if (!response.ok) {
throw response;
}
// This API has a quirk where it must be parsed twice
const result: GenerateTokenResponse = JSON.parse(await response.json());
return decodeURIComponent(result.readWrite || result.read);
};
export const isAccountRestrictedForConnectionStringLogin = async (connectionString: string): Promise<boolean> => {
const headers = new Headers();
headers.append(HttpHeaders.connectionString, connectionString);
const backendEndpoint: string = useNewPortalBackendEndpoint(BackendApi.AccountRestrictions)
? configContext.PORTAL_BACKEND_ENDPOINT
: configContext.BACKEND_ENDPOINT;
const url = backendEndpoint + "/api/guest/accountrestrictions/checkconnectionstringlogin";
const url = configContext.PORTAL_BACKEND_ENDPOINT + "/api/guest/accountrestrictions/checkconnectionstringlogin";
const response = await fetch(url, { headers, method: "POST" });
if (!response.ok) {
throw response;

View File
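Both token helpers above now call the portal backend directly. A self-contained sketch of the generate-token request (the URL path comes from the hunk; the connection-string header name is an assumption standing in for HttpHeaders.connectionString):

async function fetchEncryptedTokenSketch(portalBackendEndpoint: string, connectionString: string): Promise<string> {
  const headers = new Headers();
  headers.append("x-ms-connection-string", connectionString); // assumed header name
  const url = `${portalBackendEndpoint}/api/connectionstring/token/generatetoken`;
  const response = await fetch(url, { headers, method: "POST" });
  if (!response.ok) {
    throw response;
  }
  const encryptedToken = (await response.json()) as string;
  return decodeURIComponent(encryptedToken);
}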

@@ -16,6 +16,7 @@ export type Features = {
readonly enableAadDataPlane: boolean;
readonly enableResourceGraph: boolean;
readonly enableKoResourceTree: boolean;
readonly enableThroughputBuckets: boolean;
readonly hostedDataExplorer: boolean;
readonly junoEndpoint?: string;
readonly phoenixEndpoint?: string;
@@ -38,7 +39,6 @@ export type Features = {
readonly copilotChatFixedMonacoEditorHeight: boolean;
readonly enablePriorityBasedExecution: boolean;
readonly disableConnectionStringLogin: boolean;
readonly restoreTabs: boolean;
// can be set via both flight and feature flag
autoscaleDefault: boolean;
@@ -82,6 +82,7 @@ export function extractFeatures(given = new URLSearchParams(window.location.sear
enableSpark: "true" === get("enablespark"),
enableTtl: "true" === get("enablettl"),
enableKoResourceTree: "true" === get("enablekoresourcetree"),
enableThroughputBuckets: "true" === get("enablethroughputbuckets"),
executeSproc: "true" === get("dataexplorerexecutesproc"),
hostedDataExplorer: "true" === get("hosteddataexplorerenabled"),
mongoProxyEndpoint: get("mongoproxyendpoint"),
@@ -109,7 +110,6 @@ export function extractFeatures(given = new URLSearchParams(window.location.sear
copilotChatFixedMonacoEditorHeight: "true" === get("copilotchatfixedmonacoeditorheight"),
enablePriorityBasedExecution: "true" === get("enableprioritybasedexecution"),
disableConnectionStringLogin: "true" === get("disableconnectionstringlogin"),
restoreTabs: "true" === get("restoretabs"),
};
}
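The new enableThroughputBuckets flag follows the existing boolean pattern in extractFeatures. The pattern in isolation (the repo's get() helper and any query-string key normalization are assumed):

function readBooleanFeature(params: URLSearchParams, key: string): boolean {
  // extractFeatures delegates key lookup to a get() helper; this sketch reads the raw key.
  return params.get(key) === "true";
}

// Usage (key name illustrative): readBooleanFeature(new URLSearchParams(window.location.search), "enablethroughputbuckets");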

View File

@@ -41,13 +41,13 @@ const getDescriptor = async (selfServeType: SelfServeType): Promise<SelfServeDes
case SelfServeType.example: {
const SelfServeExample = await import(/* webpackChunkName: "SelfServeExample" */ "./Example/SelfServeExample");
const selfServeExample = new SelfServeExample.default();
await loadTranslations(selfServeExample.constructor.name);
await loadTranslations(selfServeType);
return selfServeExample.toSelfServeDescriptor();
}
case SelfServeType.sqlx: {
const SqlX = await import(/* webpackChunkName: "SqlX" */ "./SqlX/SqlX");
const sqlX = new SqlX.default();
await loadTranslations(sqlX.constructor.name);
await loadTranslations(selfServeType);
return sqlX.toSelfServeDescriptor();
}
case SelfServeType.graphapicompute: {
@@ -55,7 +55,7 @@ const getDescriptor = async (selfServeType: SelfServeType): Promise<SelfServeDes
/* webpackChunkName: "GraphAPICompute" */ "./GraphAPICompute/GraphAPICompute"
);
const graphAPICompute = new GraphAPICompute.default();
await loadTranslations(graphAPICompute.constructor.name);
await loadTranslations(selfServeType);
return graphAPICompute.toSelfServeDescriptor();
}
case SelfServeType.materializedviewsbuilder: {
@@ -63,7 +63,7 @@ const getDescriptor = async (selfServeType: SelfServeType): Promise<SelfServeDes
/* webpackChunkName: "MaterializedViewsBuilder" */ "./MaterializedViewsBuilder/MaterializedViewsBuilder"
);
const materializedViewsBuilder = new MaterializedViewsBuilder.default();
await loadTranslations(materializedViewsBuilder.constructor.name);
await loadTranslations(selfServeType);
return materializedViewsBuilder.toSelfServeDescriptor();
}
default:
@@ -103,7 +103,7 @@ const handleMessage = async (event: MessageEvent): Promise<void> => {
const urlSearchParams = new URLSearchParams(window.location.search);
const selfServeTypeText = urlSearchParams.get("selfServeType") || inputs.selfServeType;
const selfServeType = SelfServeType[selfServeTypeText?.toLowerCase() as keyof typeof SelfServeType];
const selfServeType = SelfServeType[selfServeTypeText.toLocaleLowerCase() as keyof typeof SelfServeType];
if (
!inputs.subscriptionId ||
!inputs.resourceGroup ||

View File

@@ -29,10 +29,11 @@ export enum SelfServeType {
// Unsupported self serve type passed as feature flag
invalid = "invalid",
// Add your self serve types here
example = "example",
sqlx = "sqlx",
graphapicompute = "graphapicompute",
materializedviewsbuilder = "materializedviewsbuilder",
// NOTE: text and casing of the enum's value must match the corresponding file in Localization\en\
example = "SelfServeExample",
sqlx = "SqlX",
graphapicompute = "GraphAPICompute",
materializedviewsbuilder = "MaterializedViewsBuilder",
}
/**

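The enum values above now double as localization file names, which is why the SelfServe descriptors load translations with the SelfServeType value itself instead of constructor.name. A sketch of that coupling (the loader path is assumed from the NOTE in the enum):

enum SelfServeTypeSketch {
  example = "SelfServeExample",
  sqlx = "SqlX",
  graphapicompute = "GraphAPICompute",
  materializedviewsbuilder = "MaterializedViewsBuilder",
}

async function loadTranslationsSketch(selfServeType: SelfServeTypeSketch): Promise<Record<string, string>> {
  // Assumed layout: Localization/en/<enum value>.json
  const response = await fetch(`/Localization/en/${selfServeType}.json`);
  return (await response.json()) as Record<string, string>;
}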
View File

@@ -89,3 +89,7 @@ export const getItemName = (): string => {
return "Items";
}
};
export const isDataplaneRbacSupported = (apiType: string): boolean => {
return apiType === "SQL" || apiType === "Tables";
};
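Expected results of the new helper, for reference:

isDataplaneRbacSupported("SQL"); // true
isDataplaneRbacSupported("Tables"); // true
isDataplaneRbacSupported("Gremlin"); // false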

View File

@@ -1,11 +1,4 @@
import {
BackendApi,
CassandraProxyEndpoints,
JunoEndpoints,
MongoProxyEndpoints,
PortalBackendEndpoints,
} from "Common/Constants";
import { configContext } from "ConfigContext";
import { CassandraProxyEndpoints, JunoEndpoints, MongoProxyEndpoints, PortalBackendEndpoints } from "Common/Constants";
import * as Logger from "../Common/Logger";
export function validateEndpoint(
@@ -73,9 +66,6 @@ export const PortalBackendIPs: { [key: string]: string[] } = {
//"https://main2.documentdb.ext.azure.com": ["104.42.196.69"],
"https://main.documentdb.ext.azure.cn": ["139.217.8.252"],
"https://main.documentdb.ext.azure.us": ["52.244.48.71"],
// Add ussec and usnat when endpoint address is known:
//ussec: ["29.26.26.67", "29.26.26.66"],
//usnat: ["7.28.202.68"],
};
export const PortalBackendOutboundIPs: { [key: string]: string[] } = {
@@ -100,14 +90,6 @@ export const defaultAllowedMongoProxyEndpoints: ReadonlyArray<string> = [
MongoProxyEndpoints.Mooncake,
];
export const allowedMongoProxyEndpoints_ToBeDeprecated: ReadonlyArray<string> = [
"https://main.documentdb.ext.azure.com",
"https://main.documentdb.ext.azure.cn",
"https://main.documentdb.ext.azure.us",
"https://main.cosmos.ext.azure",
"https://localhost:12901",
];
export const defaultAllowedCassandraProxyEndpoints: ReadonlyArray<string> = [
CassandraProxyEndpoints.Development,
CassandraProxyEndpoints.Mpac,
@@ -141,9 +123,7 @@ export const allowedArcadiaEndpoints: ReadonlyArray<string> = ["https://workspac
export const allowedHostedExplorerEndpoints: ReadonlyArray<string> = ["https://cosmos.azure.com/"];
export const allowedMsalRedirectEndpoints: ReadonlyArray<string> = [
"https://cosmos-explorer-preview.azurewebsites.net/",
];
export const allowedMsalRedirectEndpoints: ReadonlyArray<string> = ["https://dataexplorer-preview.azurewebsites.net/"];
export const allowedJunoOrigins: ReadonlyArray<string> = [
JunoEndpoints.Test,
@@ -155,53 +135,3 @@ export const allowedJunoOrigins: ReadonlyArray<string> = [
];
export const allowedNotebookServerUrls: ReadonlyArray<string> = [];
//
// Temporary function to determine if a portal backend API is supported by the
// new backend in this environment.
//
// TODO: Remove this function once new backend migration is completed for all environments.
//
export function useNewPortalBackendEndpoint(backendApi: string): boolean {
// This maps backend APIs to the environments supported by the new backend.
const newBackendApiEnvironmentMap: { [key: string]: string[] } = {
[BackendApi.GenerateToken]: [
PortalBackendEndpoints.Development,
PortalBackendEndpoints.Mpac,
PortalBackendEndpoints.Prod,
],
[BackendApi.PortalSettings]: [
PortalBackendEndpoints.Development,
PortalBackendEndpoints.Mpac,
PortalBackendEndpoints.Prod,
],
[BackendApi.AccountRestrictions]: [
PortalBackendEndpoints.Development,
PortalBackendEndpoints.Mpac,
PortalBackendEndpoints.Prod,
],
[BackendApi.RuntimeProxy]: [
PortalBackendEndpoints.Development,
PortalBackendEndpoints.Mpac,
PortalBackendEndpoints.Prod,
],
[BackendApi.DisallowedLocations]: [
PortalBackendEndpoints.Development,
PortalBackendEndpoints.Mpac,
PortalBackendEndpoints.Prod,
PortalBackendEndpoints.Fairfax,
PortalBackendEndpoints.Mooncake,
],
[BackendApi.SampleData]: [
PortalBackendEndpoints.Development,
PortalBackendEndpoints.Mpac,
PortalBackendEndpoints.Prod,
],
};
if (!newBackendApiEnvironmentMap[backendApi] || !configContext.PORTAL_BACKEND_ENDPOINT) {
return false;
}
return newBackendApiEnvironmentMap[backendApi].includes(configContext.PORTAL_BACKEND_ENDPOINT);
}
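With useNewPortalBackendEndpoint removed, the checks that remain in this file are allow-list validations. A generic sketch of such a check (validateEndpoint's actual signature is not shown in this hunk, so this is illustrative only):

function isAllowedEndpoint(endpoint: string | undefined, allowed: ReadonlyArray<string>): boolean {
  if (!endpoint) {
    return false;
  }
  try {
    const origin = new URL(endpoint).origin;
    return allowed.some((allowedEndpoint) => new URL(allowedEndpoint).origin === origin);
  } catch {
    return false;
  }
}

// Example (names illustrative): isAllowedEndpoint(configuredJunoEndpoint, allowedJunoOrigins);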

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* Lists the Cassandra keyspaces under an existing Azure Cosmos DB database account. */
export async function listCassandraKeyspaces(

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* Retrieves the metrics determined by the given filter for the given database account and collection. */
export async function listMetrics(

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* Retrieves the metrics determined by the given filter for the given collection, split by partition. */
export async function listMetrics(

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* Retrieves the metrics determined by the given filter for the given collection and region, split by partition. */
export async function listMetrics(

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* Retrieves the metrics determined by the given filter for the given database account, collection and region. */
export async function listMetrics(

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* Retrieves the metrics determined by the given filter for the given database account and database. */
export async function listMetrics(

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* Retrieves the metrics determined by the given filter for the given database account and region. */
export async function listMetrics(

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* Retrieves the properties of an existing Azure Cosmos DB database account. */
export async function get(

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* Lists the graphs under an existing Azure Cosmos DB database account. */
export async function listGraphs(

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* Lists the Gremlin databases under an existing Azure Cosmos DB database account. */
export async function listGremlinDatabases(

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* List Cosmos DB locations and their properties */
export async function list(subscriptionId: string): Promise<Types.LocationListResult | Types.CloudError> {

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* Lists the MongoDB databases under an existing Azure Cosmos DB database account. */
export async function listMongoDBDatabases(

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* Lists all of the available Cosmos DB Resource Provider operations. */
export async function list(): Promise<Types.OperationListResult> {

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* Retrieves the metrics determined by the given filter for the given partition key range id. */
export async function listMetrics(

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* Retrieves the metrics determined by the given filter for the given partition key range id and region. */
export async function listMetrics(

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* Retrieves the metrics determined by the given filter for the given database account. This url is only for PBS and Replication Latency data */
export async function listMetrics(

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* Retrieves the metrics determined by the given filter for the given account, source and target region. This url is only for PBS and Replication Latency data */
export async function listMetrics(

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* Retrieves the metrics determined by the given filter for the given account target region. This url is only for PBS and Replication Latency data */
export async function listMetrics(

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-05-15-preview";
const apiVersion = "2024-12-01-preview";
/* Lists the SQL databases under an existing Azure Cosmos DB database account. */
export async function listSqlDatabases(

View File

@@ -3,13 +3,13 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
import { configContext } from "../../../../ConfigContext";
import { armRequest } from "../../request";
import * as Types from "./types";
const apiVersion = "2024-02-15-preview";
const apiVersion = "2024-12-01-preview";
/* Lists the Tables under an existing Azure Cosmos DB database account. */
export async function listTables(

View File

@@ -3,7 +3,7 @@
Run "npm run generateARMClients" to regenerate
Editing this file directly should be done with extreme caution so as not to diverge from ARM REST specs
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-02-15-preview/cosmos-db.json
Generated from: https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
*/
/* The List operation response, that contains the client encryption keys and their properties. */
@@ -553,6 +553,12 @@ export interface DatabaseAccountGetProperties {
/* The object that represents all properties related to capacity enforcement on an account. */
capacity?: Capacity;
/* Indicates the capacityMode of the Cosmos DB account. */
capacityMode?: CapacityMode;
/* The object that represents the migration state for the CapacityMode of the Cosmos DB account. */
capacityModeChangeTransitionState?: CapacityModeChangeTransitionState;
/* Flag to indicate whether to enable MaterializedViews on the Cosmos DB account */
enableMaterializedViews?: boolean;
/* The object that represents the metadata for the Account Keys of the Cosmos DB account. */
@@ -652,6 +658,9 @@ export interface DatabaseAccountCreateUpdateProperties {
/* The object that represents all properties related to capacity enforcement on an account. */
capacity?: Capacity;
/* Indicates the capacityMode of the Cosmos DB account. */
capacityMode?: CapacityMode;
/* Flag to indicate whether to enable MaterializedViews on the Cosmos DB account */
enableMaterializedViews?: boolean;
/* This property is ignored during the update/create operation, as the metadata is read-only. The object represents the metadata for the Account Keys of the Cosmos DB account. */
@@ -754,6 +763,9 @@ export interface DatabaseAccountUpdateProperties {
/* The object that represents all properties related to capacity enforcement on an account. */
capacity?: Capacity;
/* Indicates the capacityMode of the Cosmos DB account. */
capacityMode?: CapacityMode;
/* Flag to indicate whether to enable MaterializedViews on the Cosmos DB account */
enableMaterializedViews?: boolean;
/* This property is ignored during the update operation, as the metadata is read-only. The object represents the metadata for the Account Keys of the Cosmos DB account. */
@@ -1087,6 +1099,8 @@ export interface ThroughputSettingsResource {
readonly instantMaximumThroughput?: string;
/* The maximum throughput value or the maximum maxThroughput value (for autoscale) that can be specified */
readonly softAllowedMaximumThroughput?: string;
/* Array of Throughput Bucket limits to be applied to the Cosmos DB container */
throughputBuckets?: ThroughputBucketResource[];
}
/* Cosmos DB provisioned throughput settings object */
@@ -1114,6 +1128,14 @@ export interface ThroughputPolicyResource {
incrementPercent?: number;
}
/* Cosmos DB throughput bucket object */
export interface ThroughputBucketResource {
/* Represents the throughput bucket id */
id: number;
/* Represents maximum percentage throughput that can be used by the bucket */
maxThroughputPercentage: number;
}
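As a quick illustration (not part of the diff), a hypothetical bucket list matching the new ThroughputBucketResource shape; the ids and percentages are made-up values and would be supplied via ThroughputSettingsResource.throughputBuckets:
// Two illustrative buckets: bucket 1 may use up to 50% of the container's throughput, bucket 2 up to 20%.
const throughputBuckets: ThroughputBucketResource[] = [
  { id: 1, maxThroughputPercentage: 50 },
  { id: 2, maxThroughputPercentage: 20 },
];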
/* Cosmos DB options resource object */
export interface OptionsResource {
/* Value of the Cosmos DB resource throughput or autoscaleSettings. Use the ThroughputSetting resource when retrieving offer details. */
@@ -1235,10 +1257,6 @@ export interface SqlDatabaseResource {
export interface SqlContainerResource {
/* Name of the Cosmos DB SQL container */
id: string;
vectorEmbeddingPolicy?: VectorEmbeddingPolicy;
fullTextPolicy?: FullTextPolicy;
/* The configuration of the indexing policy. By default, the indexing is automatic for all document paths within the container */
indexingPolicy?: IndexingPolicy;
@@ -1269,39 +1287,11 @@ export interface SqlContainerResource {
/* List of computed properties */
computedProperties?: ComputedProperty[];
}
export interface VectorEmbeddingPolicy {
vectorEmbeddings: VectorEmbedding[];
}
/* The vector embedding policy for the container. */
vectorEmbeddingPolicy?: VectorEmbeddingPolicy;
export interface VectorEmbedding {
path?: string;
dataType?: string;
dimensions?: number;
distanceFunction?: string;
}
export interface FullTextPolicy {
/**
* The default language for the full text.
*/
defaultLanguage: string;
/**
* The paths to be indexed for full text search.
*/
fullTextPaths: FullTextPath[];
}
export interface FullTextPath {
/**
* The path to be indexed for full text search.
*/
path: string;
/**
* The language for the full text path.
*/
language: string;
fullTextPolicy?: FullTextPolicy;
}
/* Cosmos DB indexing policy */
@@ -1323,19 +1313,14 @@ export interface IndexingPolicy {
/* List of spatial specifics */
spatialIndexes?: SpatialSpec[];
/* List of paths to include in the vector indexing */
vectorIndexes?: VectorIndex[];
fullTextIndexes?: FullTextIndex[];
}
export interface VectorIndex {
path?: string;
type?: string;
}
export interface FullTextIndex {
/** The path in the JSON document to index. */
path: string;
/* Cosmos DB Vector Embedding Policy */
export interface VectorEmbeddingPolicy {
/* List of vector embeddings */
vectorEmbeddings?: VectorEmbedding[];
}
/* undocumented */
@@ -1363,6 +1348,50 @@ export interface Indexes {
kind?: "Hash" | "Range" | "Spatial";
}
/* undocumented */
export interface VectorIndex {
/* The path to the vector field in the document. */
path: string;
/* The index type of the vector. Currently, flat, diskANN, and quantizedFlat are supported. */
type: "flat" | "diskANN" | "quantizedFlat";
}
/* Represents a vector embedding. A vector embedding is used to define a vector field in the documents. */
export interface VectorEmbedding {
/* The path to the vector field in the document. */
path: string;
/* Indicates the data type of vector. */
dataType: "float16" | "float32" | "uint8" | "int8";
/* The distance function to use for distance calculation in between vectors. */
distanceFunction: "euclidean" | "cosine" | "dotproduct";
/* The number of dimensions in the vector. */
dimensions: number;
}
export interface FullTextPolicy {
/**
* The default language for the full text.
*/
defaultLanguage: string;
/**
* The paths to be indexed for full text search.
*/
fullTextPaths: FullTextPath[];
}
export interface FullTextPath {
/**
* The path to be indexed for full text search.
*/
path: string;
/**
* The language for the full text path.
*/
language: string;
}
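Pulling the new types together (not part of the diff), a minimal sketch of a container resource that uses both policies plus the matching indexes; the id, paths, language, and dimension count are illustrative, and any required fields not shown in this hunk are omitted:
// Hypothetical SQL container combining the vector and full text types introduced above.
const container: SqlContainerResource = {
  id: "products",
  vectorEmbeddingPolicy: {
    vectorEmbeddings: [{ path: "/embedding", dataType: "float32", distanceFunction: "cosine", dimensions: 1536 }],
  },
  fullTextPolicy: {
    defaultLanguage: "en-US",
    fullTextPaths: [{ path: "/description", language: "en-US" }],
  },
  indexingPolicy: {
    vectorIndexes: [{ path: "/embedding", type: "diskANN" }],
    fullTextIndexes: [{ path: "/description" }],
  },
};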
/* List of composite path */
export type CompositePathList = CompositePath[];
@@ -1688,6 +1717,28 @@ export interface Capacity {
totalThroughputLimit?: number;
}
/* Indicates the capacity mode of the account. */
export type CapacityMode = "None" | "Provisioned" | "Serverless";
/* The transition state information related to a capacity mode change requested via an update request. */
export interface CapacityModeChangeTransitionState {
/* The transition status of capacity mode. */
capacityModeTransitionStatus?: "Invalid" | "Initialized" | "InProgress" | "Completed" | "Failed";
/* Indicates the current capacity mode of the account. */
currentCapacityMode?: "None" | "Provisioned" | "Serverless";
/* Indicates the previous capacity mode of the account before successful transition. */
previousCapacityMode?: "None" | "Provisioned" | "Serverless";
/* Begin time in UTC of the capacity mode change. */
readonly capacityModeTransitionBeginTimestamp?: string;
/* End time in UTC of the capacity mode change. */
readonly capacityModeTransitionEndTimestamp?: string;
/* End time in UTC of the last successful capacity mode change. */
readonly capacityModeLastSuccessfulTransitionEndTimestamp?: string;
}
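A small, hedged example of how a caller might consume the new transition state (not part of the diff; the property is read off DatabaseAccountGetProperties as declared earlier in this file):
// Returns true while a capacity mode change (e.g. Provisioned -> Serverless) is still running.
function isCapacityModeChangeInProgress(props: DatabaseAccountGetProperties): boolean {
  return props.capacityModeChangeTransitionState?.capacityModeTransitionStatus === "InProgress";
}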
/* Tags are a list of key-value pairs that describe the resource. These tags can be used in viewing and grouping this resource (across resource groups). A maximum of 15 tags can be provided for a resource. Each tag must have a key no greater than 128 characters and value no greater than 256 characters. For example, the default experience for a template type is set with "defaultExperience": "Cassandra". Current "defaultExperience" values also include "Table", "Graph", "DocumentDB", and "MongoDB". */
export type Tags = { [key: string]: string };
@@ -1953,8 +2004,8 @@ export type PublicNetworkAccess = "Enabled" | "Disabled" | "SecuredByPerimeter";
/* undocumented */
export interface ApiProperties {
/* Describes the ServerVersion of an a MongoDB account. */
serverVersion?: "3.2" | "3.6" | "4.0" | "4.2";
/* Describes the version of the MongoDB account. */
serverVersion?: "3.2" | "3.6" | "4.0" | "4.2" | "5.0" | "6.0" | "7.0";
}
/* Analytical storage specific properties. */

View File

@@ -13,7 +13,7 @@ import {
readSubComponentState,
} from "Shared/AppStatePersistenceUtility";
import { LocalStorageUtility, StorageKey } from "Shared/StorageUtility";
import { useNewPortalBackendEndpoint } from "Utils/EndpointUtils";
import { isDataplaneRbacSupported } from "Utils/APITypeUtils";
import { logConsoleError } from "Utils/NotificationConsoleUtils";
import { useQueryCopilot } from "hooks/useQueryCopilot";
import { ReactTabKind, useTabs } from "hooks/useTabs";
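For context (not part of the diff): the hunks below swap the hard-coded `userContext.apiType === "SQL"` checks for the shared isDataplaneRbacSupported helper imported here. Its real implementation lives in Utils/APITypeUtils and is not shown in this diff; a hedged guess at its shape:
// Assumed sketch only; the actual helper may cover additional API types.
const isDataplaneRbacSupported = (apiType: string): boolean => apiType === "SQL" || apiType === "Tables";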
@@ -85,9 +85,7 @@ export function useKnockoutExplorer(platform: Platform): Explorer {
await updateContextForSampleData(explorer);
}
if (userContext.features.restoreTabs) {
restoreOpenTabs();
}
restoreOpenTabs();
setExplorer(explorer);
}
@@ -98,7 +96,6 @@ export function useKnockoutExplorer(platform: Platform): Explorer {
useEffect(() => {
if (explorer) {
applyExplorerBindings(explorer);
explorer.openNPSSurveyDialog();
}
}, [explorer]);
@@ -184,6 +181,11 @@ async function configureFabric(): Promise<Explorer> {
}
const openFirstContainer = async (explorer: Explorer, databaseName: string, collectionName?: string) => {
if (useTabs.getState().openedTabs.length > 0) {
// Don't open any tabs if there are already tabs open
return;
}
// Expand database first
databaseName = sessionStorage.getItem("openDatabaseName") ?? databaseName;
const database = useDatabases.getState().databases.find((db) => db.id() === databaseName);
@@ -298,7 +300,7 @@ async function configureHostedWithAAD(config: AAD): Promise<Explorer> {
);
if (!userContext.features.enableAadDataPlane) {
Logger.logInfo(`AAD Feature flag is not enabled for account ${account.name}`, "Explorer/configureHostedWithAAD");
if (userContext.apiType === "SQL") {
if (isDataplaneRbacSupported(userContext.apiType)) {
if (LocalStorageUtility.hasItem(StorageKey.DataPlaneRbacEnabled)) {
const isDataPlaneRbacSetting = LocalStorageUtility.getEntryString(StorageKey.DataPlaneRbacEnabled);
Logger.logInfo(
@@ -546,20 +548,12 @@ async function configurePortal(): Promise<Explorer> {
const inputs = message?.inputs;
const openAction = message?.openAction;
if (inputs) {
if (
configContext.BACKEND_ENDPOINT &&
configContext.platform === Platform.Portal &&
process.env.NODE_ENV === "development"
) {
inputs.extensionEndpoint = configContext.PROXY_PATH;
}
updateContextsFromPortalMessage(inputs);
const { databaseAccount: account, subscriptionId, resourceGroup } = userContext;
let dataPlaneRbacEnabled;
if (userContext.apiType === "SQL") {
if (isDataplaneRbacSupported(userContext.apiType)) {
if (LocalStorageUtility.hasItem(StorageKey.DataPlaneRbacEnabled)) {
const isDataPlaneRbacSetting = LocalStorageUtility.getEntryString(StorageKey.DataPlaneRbacEnabled);
Logger.logInfo(
@@ -678,18 +672,17 @@ function updateAADEndpoints(portalEnv: PortalEnv) {
function updateContextsFromPortalMessage(inputs: DataExplorerInputsFrame) {
if (
configContext.BACKEND_ENDPOINT &&
configContext.PORTAL_BACKEND_ENDPOINT &&
configContext.platform === Platform.Portal &&
process.env.NODE_ENV === "development"
) {
inputs.extensionEndpoint = configContext.PROXY_PATH;
inputs.portalBackendEndpoint = configContext.PROXY_PATH;
}
const authorizationToken = inputs.authorizationToken || "";
const databaseAccount = inputs.databaseAccount;
updateConfigContext({
BACKEND_ENDPOINT: inputs.extensionEndpoint || configContext.BACKEND_ENDPOINT,
ARM_ENDPOINT: normalizeArmEndpoint(inputs.csmEndpoint || configContext.ARM_ENDPOINT),
MONGO_PROXY_ENDPOINT: inputs.mongoProxyEndpoint,
CASSANDRA_PROXY_ENDPOINT: inputs.cassandraProxyEndpoint,
@@ -792,17 +785,7 @@ async function updateContextForSampleData(explorer: Explorer): Promise<void> {
return;
}
let url: string;
if (useNewPortalBackendEndpoint(Constants.BackendApi.SampleData)) {
url = createUri(configContext.PORTAL_BACKEND_ENDPOINT, "/api/sampledata");
} else {
const sampleDatabaseEndpoint = useQueryCopilot.getState().copilotUserDBEnabled
? `/api/tokens/sampledataconnection/v2`
: `/api/tokens/sampledataconnection`;
url = createUri(`${configContext.BACKEND_ENDPOINT}`, sampleDatabaseEndpoint);
}
const url: string = createUri(configContext.PORTAL_BACKEND_ENDPOINT, "/api/sampledata");
const authorizationHeader = getAuthorizationHeader();
const headers = { [authorizationHeader.header]: authorizationHeader.token };
@@ -818,7 +801,7 @@ async function updateContextForSampleData(explorer: Explorer): Promise<void> {
const sampleDataConnectionInfo = parseResourceTokenConnectionString(data.connectionString);
updateUserContext({ sampleDataConnectionInfo });
await explorer.refreshSampleData();
explorer.refreshSampleData();
}
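Worked example of the simplified URL construction above (not part of the diff): assuming createUri simply joins host and path, and using the MPAC portal backend endpoint that appears in the test configuration later in this diff, the sample-data request resolves roughly to:
// Illustrative only; the actual value depends on the environment's PORTAL_BACKEND_ENDPOINT.
// createUri("https://cdb-ms-mpac-pbe.cosmos.azure.com", "/api/sampledata")
//   => "https://cdb-ms-mpac-pbe.cosmos.azure.com/api/sampledata"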
interface SampledataconnectionResponse {

View File

@@ -1,16 +1,9 @@
import { useEffect, useState } from "react";
import { useNewPortalBackendEndpoint } from "Utils/EndpointUtils";
import { ApiEndpoints, BackendApi, HttpHeaders } from "../Common/Constants";
import { HttpHeaders } from "../Common/Constants";
import { configContext } from "../ConfigContext";
import { AccessInputMetadata } from "../Contracts/DataModels";
const url = `${configContext.BACKEND_ENDPOINT}${ApiEndpoints.guestRuntimeProxy}/accessinputmetadata?_=1609359229955`;
export async function fetchAccessData(portalToken: string): Promise<AccessInputMetadata> {
if (!useNewPortalBackendEndpoint(BackendApi.RuntimeProxy)) {
return fetchAccessData_ToBeDeprecated(portalToken);
}
const headers = new Headers();
// Portal encrypted token API quirk: The token header must be URL encoded
headers.append(HttpHeaders.guestAccessToken, encodeURIComponent(portalToken));
@@ -25,25 +18,6 @@ export async function fetchAccessData(portalToken: string): Promise<AccessInputM
.catch((error) => console.error(error));
}
export async function fetchAccessData_ToBeDeprecated(portalToken: string): Promise<AccessInputMetadata> {
const headers = new Headers();
// Portal encrypted token API quirk: The token header must be URL encoded
headers.append(HttpHeaders.guestAccessToken, encodeURIComponent(portalToken));
const options = {
method: "GET",
headers: headers,
};
return (
fetch(url, options)
.then((response) => response.json())
// Portal encrypted token API quirk: The response is double JSON encoded
.then((json) => JSON.parse(json))
.catch((error) => console.error(error))
);
}
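With the legacy path removed, callers go through the single fetchAccessData implementation; a hedged usage sketch (the token literal is a placeholder, not a real value):
// Hypothetical call site: resolve portal access metadata from an encrypted portal token.
const metadata = await fetchAccessData("<encrypted-portal-token>");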
export function useTokenMetadata(token: string): AccessInputMetadata | undefined {
const [state, setState] = useState<AccessInputMetadata | undefined>();

View File

@@ -1,4 +1,4 @@
import { AzureCliCredentials } from "@azure/ms-rest-nodeauth";
import { AzureCliCredential } from "@azure/identity";
import { expect, Frame, Locator, Page } from "@playwright/test";
import crypto from "crypto";
@@ -20,13 +20,13 @@ export function generateUniqueName(baseName, options?: TestNameOptions): string
return `${prefix}${baseName}${crypto.randomBytes(length).toString("hex")}${suffix}`;
}
export async function getAzureCLICredentials(): Promise<AzureCliCredentials> {
return await AzureCliCredentials.create();
export function getAzureCLICredentials(): AzureCliCredential {
return new AzureCliCredential();
}
export async function getAzureCLICredentialsToken(): Promise<string> {
const credentials = await getAzureCLICredentials();
const token = (await credentials.getToken()).accessToken;
const credentials = getAzureCLICredentials();
const token = (await credentials.getToken("https://management.core.windows.net//.default")).token;
return token;
}
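Hedged usage sketch (not part of the diff): the CLI-issued token above targets ARM, so it can be attached as a bearer token to a raw management request; the URL and api-version below are illustrative only:
// Illustrative only: list subscriptions with the token returned by getAzureCLICredentialsToken().
const token = await getAzureCLICredentialsToken();
const response = await fetch("https://management.azure.com/subscriptions?api-version=2020-01-01", {
  headers: { Authorization: `Bearer ${token}` },
});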

View File

@@ -13,7 +13,7 @@ import {
} from "../fx";
test("SQL account using Resource token", async ({ page }) => {
const credentials = await getAzureCLICredentials();
const credentials = getAzureCLICredentials();
const armClient = new CosmosDBManagementClient(credentials, subscriptionId);
const accountName = getAccountName(TestAccount.SQL);
const account = await armClient.databaseAccounts.get(resourceGroupName, accountName);

View File

@@ -56,7 +56,7 @@ export class TestContainerContext {
export async function createTestSQLContainer(includeTestData?: boolean) {
const databaseId = generateUniqueName("db");
const containerId = "testcontainer"; // A unique container name isn't needed because the database is unique
const credentials = await getAzureCLICredentials();
const credentials = getAzureCLICredentials();
const armClient = new CosmosDBManagementClient(credentials, subscriptionId);
const accountName = getAccountName(TestAccount.SQL);
const account = await armClient.databaseAccounts.get(resourceGroupName, accountName);

View File

@@ -39,7 +39,6 @@ const initTestExplorer = async (): Promise<void> => {
csmEndpoint: "https://management.azure.com",
dnsSuffix: "documents.azure.com",
serverId: "prod1",
extensionEndpoint: "/proxy",
portalBackendEndpoint: "https://cdb-ms-mpac-pbe.cosmos.azure.com",
mongoProxyEndpoint: "https://cdb-ms-mpac-mp.cosmos.azure.com",
cassandraProxyEndpoint: "https://cdb-ms-mpac-cp.cosmos.azure.com",

View File

@@ -16,13 +16,13 @@ Results of this file should be checked into the repo.
*/
// CHANGE THESE VALUES TO GENERATE NEW CLIENTS
const version = "2024-02-15-preview";
/* The following are legal options for resourceName but you generally will only use cosmos-db:
"cosmos-db" | "managedCassandra" | "mongorbac" | "notebook" | "privateEndpointConnection" | "privateLinkResources" |
const version = "2024-12-01-preview";
/* The following are legal options for resourceName but you generally will only use cosmos:
"cosmos" | "managedCassandra" | "mongorbac" | "notebook" | "privateEndpointConnection" | "privateLinkResources" |
"rbac" | "restorable" | "services" | "dataTransferService"
*/
const githubResourceName = "cosmos-db";
const deResourceName = "cosmos-db";
const deResourceName = "cosmos";
const schemaURL = `https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/${version}/${githubResourceName}.json`;
const outputDir = path.join(__dirname, `../../src/Utils/arm/generatedClients/${deResourceName}`);
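With the new constants, the interpolated values above resolve as follows (the repo-relative output path is abbreviated):
// schemaURL => https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/specification/cosmos-db/resource-manager/Microsoft.DocumentDB/preview/2024-12-01-preview/cosmos-db.json
// outputDir => <repo>/src/Utils/arm/generatedClients/cosmos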

View File

@@ -1,4 +1,4 @@
const msRestNodeAuth = require("@azure/ms-rest-nodeauth");
const { AzureCliCredential } = require("@azure/identity");
const { CosmosDBManagementClient } = require("@azure/arm-cosmosdb");
const ms = require("ms");
@@ -16,7 +16,7 @@ function friendlyTime(date) {
}
async function main() {
const credentials = await msRestNodeAuth.AzureCliCredentials.create();
const credentials = new AzureCliCredential();
const client = new CosmosDBManagementClient(credentials, subscriptionId);
const accounts = await client.databaseAccounts.list(resourceGroupName);
for (const account of accounts) {

View File

@@ -30,7 +30,7 @@
<clear />
<add name="X-Xss-Protection" value="1; mode=block" />
<add name="X-Content-Type-Options" value="nosniff" />
<add name="Content-Security-Policy" value="frame-ancestors 'self' portal.azure.com *.portal.azure.com portal.azure.us portal.azure.cn portal.microsoftazure.de df.onecloud.azure-test.net *.fabric.microsoft.com *.powerbi.com *.analysis-df.windows.net cosmos-explorer-preview.azurewebsites.net" />
<add name="Content-Security-Policy" value="frame-ancestors 'self' portal.azure.com *.portal.azure.com portal.azure.us portal.azure.cn portal.microsoftazure.de df.onecloud.azure-test.net *.fabric.microsoft.com *.powerbi.com *.analysis-df.windows.net dataexplorer-preview.azurewebsites.net" />
</customHeaders>
<redirectHeaders>
<clear />

View File

@@ -20,9 +20,7 @@ const isCI = require("is-ci");
const gitSha = childProcess.execSync("git rev-parse HEAD").toString("utf8");
const AZURE_CLIENT_ID = "fd8753b0-0707-4e32-84e9-2532af865fb4";
const AZURE_TENANT_ID = "72f988bf-86f1-41af-91ab-2d7cd011db47";
const SUBSCRIPTION_ID = "69e02f2d-f059-4409-9eac-97e8a276ae2c";
const RESOURCE_GROUP = "de-e2e-tests";
const AZURE_CLIENT_SECRET = process.env.AZURE_CLIENT_SECRET || process.env.NOTEBOOKS_TEST_RUNNER_CLIENT_SECRET; // TODO Remove. Exists for backwards compat with old .env files. Prefer AZURE_CLIENT_SECRET
@@ -96,10 +94,7 @@ module.exports = function (_env = {}, argv = {}) {
if (mode === "development") {
envVars.NODE_ENV = "development";
envVars.AZURE_CLIENT_ID = AZURE_CLIENT_ID;
envVars.AZURE_TENANT_ID = AZURE_TENANT_ID;
envVars.AZURE_CLIENT_SECRET = AZURE_CLIENT_SECRET || null;
envVars.SUBSCRIPTION_ID = SUBSCRIPTION_ID;
envVars.RESOURCE_GROUP = RESOURCE_GROUP;
typescriptRule.use[0].options.compilerOptions = { target: "ES2018" };
}
@@ -187,7 +182,12 @@ module.exports = function (_env = {}, argv = {}) {
}),
new MonacoWebpackPlugin(),
new CopyWebpackPlugin({
patterns: [{ from: "DataExplorer.nuspec" }, { from: "web.config" }, { from: "quickstart/*.zip" }],
patterns: [
{ from: "DataExplorer.nuspec" },
{ from: "DataExplorer.proj" },
{ from: "web.config" },
{ from: "quickstart/*.zip" },
],
}),
new EnvironmentPlugin(envVars),
];
@@ -248,7 +248,8 @@ module.exports = function (_env = {}, argv = {}) {
new TerserPlugin({
terserOptions: {
// These options increase our initial bundle size by ~5% but the builds are significantly faster and won't run out of memory
compress: false,
// Update 2/11/2025: removing this flag now that our bundle sizes have grown, so Terser can strip dead and unreachable code at the cost of longer build times
// compress: false,
mangle: {
keep_fnames: true,
keep_classnames: true,