Compare commits

...

80 Commits

Author SHA1 Message Date
Sourabh Jain
a7e38201c4 refactored code 2025-04-07 21:26:02 +05:30
Sourabh Jain
80edb66fbf wip for private endpoint support 2025-04-04 22:41:45 +05:30
Sourabh Jain
c62e89228e updated code 2025-04-03 11:27:52 +05:30
Sourabh Jain
d68bf02a0e terminal log changes 2025-04-02 07:15:49 +05:30
Sourabh Jain
62e90ed26d vnet automation 2025-04-02 06:51:09 +05:30
Sourabh Jain
1f4e79f856 fix commands 2025-03-25 13:00:28 +05:30
Sourabh Jain
9db1edca03 fix vcore username 2025-03-25 08:04:22 +05:30
Sourabh Jain
8b4eaa95ea ui and functional changes 2025-03-25 00:24:17 +05:30
Sourabh Jain
10b0da2190 Vnet addition 2025-03-22 15:33:56 +05:30
Sourabh Jain
4313d6ecbd enable cloudshell with vnet config enabled 2025-03-18 22:47:31 +05:30
Sourabh Jain
83eafd4485 fix terminals 2025-03-18 17:13:49 +05:30
Sourabh Jain
44e85647e4 add other db support 2025-03-17 06:35:19 +05:30
Sourabh Jain
ec891671b6 code refactor 2025-02-28 07:55:48 +05:30
Sourabh Jain
942de980c3 code refactor 2025-02-25 12:09:31 +05:30
Sourabh Jain
2c3c4e7db7 added consent 2025-02-25 11:23:42 +05:30
Sourabh Jain
9b2cb8a1a9 refactor code and add cassandra and mongo commands 2025-02-24 23:24:25 +05:30
Sourabh Jain
41439cc7d4 refactor code 2025-02-24 15:10:07 +05:30
Sourabh Jain
ce08ce05f2 mongo is working 2025-02-21 21:09:44 +05:30
Sourabh Jain
323276beff cloudshell api failed with client id 2025-02-19 07:00:41 +05:30
Sourabh Jain
1678ec0a23 xterm add 2025-02-18 17:11:39 +05:30
Sourabh Jain
0babb1fa13 reverted code 2025-02-18 07:57:30 +05:30
Sourabh Jain
78c8df0904 Not working code 2025-02-18 07:53:52 +05:30
Sourabh Jain
76742455bf first draft 2025-02-17 07:12:49 +05:30
bogercraig
2730da7ab6 Backend Migration - Remove Use of Legacy Backend from DE (#2043)
* Default to new backend endpoint if the endpoint in current context does not match existing set in constants.

* Remove some env references.

* Added comments with reasoning for selecting new backend by default.

* Update comment.

* Remove all references to useNewPortalBackendEndpoint now that old backend is disabled in all environments.

* Resolve lint issues.

* Removed references to old backend from Cassandra and Mongo Apis

* fix unit tests

---------

Co-authored-by: Asier Isayas <aisayas@microsoft.com>
2025-02-12 18:12:59 -08:00
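
A minimal TypeScript sketch of the endpoint-selection rule described in the commit above. PortalBackendEndpoints comes from Common/Constants (see the constants and test diffs further down); the helper name and the choice of fallback member are assumptions, not the shipped code.

import { PortalBackendEndpoints } from "Common/Constants";

// Hypothetical sketch: keep the configured endpoint only when it matches a known
// new-backend value; otherwise default to the new backend, since the legacy
// backend is disabled in all environments.
const knownPortalBackendEndpoints: string[] = [
  PortalBackendEndpoints.Development,
  PortalBackendEndpoints.Prod,
];

function resolvePortalBackendEndpoint(configuredEndpoint: string): string {
  return knownPortalBackendEndpoints.includes(configuredEndpoint)
    ? configuredEndpoint
    : PortalBackendEndpoints.Prod; // default when the current context does not match the known set
}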
sunghyunkang1111
de2449ee25 Adding throughput bucket settings in Data Explorer (#2044)
* Added throughput bucketing

* fix bugs

* enable/disable per autoscale selection

* Added logic

* change query bucket to group

* Updated to a tab

* Fixed unit tests

* Edit package-lock

* Compile build fix

* fix unit tests

* moving the throughput bucket flag to the client generation level
2025-02-12 13:10:07 -06:00
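
A sketch of what "moving the throughput bucket flag to the client generation level" can look like, assuming the @azure/cosmos defaultHeaders client option; the endpoint and key are placeholders, and the bucket value mirrors the CosmosClient.ts diff further down.

import { CosmosClient, CosmosHeaders } from "@azure/cosmos";

// Attach the throughput-bucket header once, when the client is generated,
// rather than per request.
const defaultHeaders: CosmosHeaders = { "x-ms-cosmos-throughput-bucket": 1 };

const client = new CosmosClient({
  endpoint: "https://<account>.documents.azure.com:443/",
  key: "<key>",
  defaultHeaders,
});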
sunghyunkang1111
99378582ce Remove blocking await on sample database (#2047)
* Remove blocking await on sample database

* Remove compress flag to reduce bundle size

* Fix typo in webpack config comment date
2025-02-12 13:09:52 -06:00
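
A sketch, with hypothetical function names, of removing a blocking await so sample-database provisioning runs in the background while startup continues.

declare function createSampleDatabaseIfNeeded(): Promise<void>;

async function initializeExplorer(): Promise<void> {
  // Previously: await createSampleDatabaseIfNeeded();
  void createSampleDatabaseIfNeeded().catch((error) => {
    console.error("Sample database creation failed", error);
  });
  // Startup continues immediately; provisioning completes in the background.
}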
SATYA SB
bd592d07af [accessibility-1217621]: Keyboard focus gets lost on the page which opens after activating "Data Explorer" menu item present under 'Overview' page. (#1927)
Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-02-12 11:31:30 +05:30
asier-isayas
644f5941ec Set default throughput based on account's workload type (#2021)
* assign default throughput based on workload type

* combined common logic

* fix unit tests

* add tests

* update tests

* npm run format

* Update ci.yml

---------

Co-authored-by: Asier Isayas <aisayas@microsoft.com>
2025-02-11 17:47:55 -05:00
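
A sketch of assigning a default throughput from the account's workload type; getWorkloadType and WorkloadType appear in the DatabaseAccountUtility and Constants diffs below, but the RU/s values here are illustrative, not the shipped defaults.

import { WorkloadType } from "Common/Constants";
import { getWorkloadType } from "Common/DatabaseAccountUtility";

function getDefaultThroughputForAccount(): number {
  switch (getWorkloadType()) {
    case WorkloadType.Learning:
    case WorkloadType.DevelopmentTesting:
      return 400; // smallest manual throughput for learning/dev-test accounts (illustrative)
    case WorkloadType.Production:
      return 1000; // roomier starting point for production accounts (illustrative)
    default:
      return 400; // WorkloadType.None or untagged accounts
  }
}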
jawelton74
9fb006a996 Restore DisplayNPSSurvey message type enum which was removed in a prior (#2046)
change.
2025-02-11 06:58:44 -08:00
jawelton74
c2b98c3e23 Modify E2E cleanup script to use @azure/identity for AZ credentials. (#2051) 2025-02-10 08:48:26 -08:00
Nishtha Ahuja
76d49d86d4 Added emulator checks in settings pane fields (#2041)
* added emulator checks

* created macro

* conditions as const

---------

Co-authored-by: Nishtha Ahuja <nishthaahuja@microsoft.com>
2025-02-10 11:52:56 +05:30
Laurent Nguyen
7893b89bf7 Do not open first container if a tab is already open (#2045)
Co-authored-by: Laurent Nguyen <languye@microsoft.com>
2025-02-06 21:58:38 +01:00
JustinKol
5945e3cb6b Removed NPS Survey from DE since it has been moved to the Overview Blade (#2027)
* Removed NPS Survey from DE since it has been moved to the Overview Blade

* Added ExplorerBindings back

* Moved applyExplorerBindings back to original place
2025-02-05 13:30:03 -05:00
Laurent Nguyen
213d1c68fe Remove feature switch on restore tabs (#2039) 2025-02-03 17:59:00 +01:00
Nishtha Ahuja
c26f9a1ebb disabled change button for emulator (#2017)
Co-authored-by: Nishtha Ahuja <nishthaahuja@microsoft.com>
2025-02-03 12:39:01 +05:30
SATYA SB
bd7cd7ae8f [accessibility-3556793]: [Screen Reader- Azure Cosmos DB- Data Explorer]: The Learn more links are not descriptive present under the settings. (#2035)
Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-01-31 10:58:44 +05:30
SATYA SB
6504358580 [Programmatic Access - Azure Cosmos DB- Data Explorer]: Keyboard focus indicator is not visible on controls inside the settings. (#2016)
* [accessibility-3556824] : [Programmatic Access - Azure Cosmos DB- Data Explorer]: Keyboard focus indicator is not visible on controls inside the settings.

* Snapshots updated.

---------

Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-01-31 10:53:18 +05:30
SATYA SB
ce88659fca [Keyboard Navigation - Cosmos DB Query Copilot - Query Faster with Copilot>Enable Query Advisor]: Keyboard focus order is not logical after selecting the 'Copy code' button. (#2010)
* [accessibility-3560073]: [Keyboard Navigation - Cosmos DB Query Copilot - Query Faster with Copilot>Enable Query Advisor]: Keyboard focus order is not logical after selecting the 'Copy code' button.

* [Keyboard Navigation - Cosmos DB Query Copilot - Query Faster with Copilot>Enable Query Advisor]: Keyboard focus order is not logical after selecting the 'Copy code' button.

---------

Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-01-31 10:48:34 +05:30
SATYA SB
642c708e9c [accessibility-3556756]: [Programmatic Access- Azure Cosmos DB- Data explorer]: Ensures <img> elements have alternate text or a role of none or presentation. (#2007)
Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-01-31 10:45:49 +05:30
SATYA SB
4156009d09 [Screen reader - Cosmos DB Query Copilot - Query Faster with Copilot>Enable Query Advisor]: Screen reader does not announce status information which appears on invoking the 'Send' button. (#2002)
* [accessibility-3549715]: [Screen reader - Cosmos DB Query Copilot - Query Faster with Copilot>Enable Query Advisor]: Screen reader does not announce status information which appears on invoking the 'Send' button.

* [accessibility-3549715]:[Screen reader - Cosmos DB Query Copilot - Query Faster with Copilot>Enable Query Advisor]: Screen reader does not announce status information which appears on invoking the 'Send' button.

---------

Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-01-31 10:44:32 +05:30
SATYA SB
5c6abbd635 [accessibility-3556595]: [Programmatic Access- Azure Cosmos DB- Data Explorer]: Ensures role attribute has an appropriate value for the element. (#2001)
Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2025-01-31 10:37:11 +05:30
jawelton74
881726e9af New preview site (#2036)
* Changes to DE preview site to support managed identity. Changes to
infrastructure to use new preview site.

* Fix formatting.

* Potential fix for code scanning alert no. 56: Server-side request forgery

Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>

* Use different secrets for subscription/tenant/client id's.

* Revert new id names.

* Update Az CLI config.

* Update to Node 18 and update security vulnerable dependencies.

---------

Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
2025-01-30 16:14:03 -08:00
jawelton74
7015590d1a Remove hard coded client and subscription Ids from webpack config. (#2033) 2025-01-24 07:23:33 -08:00
jawelton74
1d952a4ea2 Remove throughput survey text and link from Throughput tab. (#2031) 2025-01-21 10:53:47 -08:00
jawelton74
2a81551a60 Use unique names in upload artifacts tasks (#2030)
* Specify actual package names in upload artifacts task.

* Revert path change, use unique names for upload task.

* Fix the right properties.

* Revert condition change
2025-01-21 07:07:53 -08:00
jawelton74
eceee36913 Use azure identity package for e2e test credentials (#2032)
* Update identity package, remove ms-rest-nodeauth package.

* Test changes to use identity package.
2025-01-21 07:07:18 -08:00
jawelton74
96faf92c12 Use dotnet CLI for nuget operations in CI pipeline (#2026)
* Start of moving nuget actions to use dotnet.

* Comment out env section

* Set auth token.

* Disable globalization support.

* Comment out dotnet setup.

* Copy proj file with build.

* Place Content item under ItemGroup.

* Update project with Sdk and No Build args.

* Remove no-build from cmd line.

* Set TargetFramework version.

* Fix TargetFramework value.

* Add nuget push command.

* Fix test version string

* Add nuget add source step.

* Fix add source args.

* Enable cleartext password, remove source after completion.

* Use wildcard for nupkg path. Add debug.

* Remove debug.

* Fix nupkg path

* Fix API key argument

* Re-enable MPAC nuget. Tidy up ci.yml.

* Fix formatting of webpack config.

* Remove Globalization flag.

* Revert test changes.
2025-01-15 11:37:30 -08:00
jawelton74
2fdb3df4ae Update to Nuget setup action v2. (#2024) 2025-01-10 17:32:12 -08:00
bogercraig
7c9802c07d Settings Menu Client Refresh Bug Fix, Limit Client Options in APIs (#2023)
* Reset hasDataPlaneRbacSettingChanged back to false after cosmos client is refreshed with new settings.
Dispose of old client before new one is created.

* Update client refresh variable after settings change.

* Only refresh client when related settings are changed.

* Update comparisons in settings menu.

* Remove unnecessary comments.

* Update refresh variable naming.

* Attempting to sync package.json and package-lock.json in CI.

* Remove npm install from CI after successful CI run.

* Only show retry settings with those APIs using the cosmos client -> NoSQL, Table, Gremlin
2025-01-10 12:42:03 -08:00
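
A sketch of the settings-save path described above; the retry-attempt setting is an illustrative stand-in for the client-related settings that actually trigger a refresh, while updateUserContext and refreshCosmosClient match the CosmosClient.ts diff further down.

import { updateUserContext } from "UserContext";

function onSettingsSaved(previousRetryAttempts: number, newRetryAttempts: number): void {
  if (previousRetryAttempts !== newRetryAttempts) {
    // client() disposes the old CosmosClient and rebuilds it on the next call
    // when refreshCosmosClient is set (see the CosmosClient.ts diff below).
    updateUserContext({ refreshCosmosClient: true });
  }
}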
jawelton74
e5609bd91e Update Playwright to latest and rename MongoProxy development endpoint constant (#2022)
* Rename MongoProxy development endpoint constant to be consistent with other
endpoints.

* Update Playwright version to latest release due to test setup break.
2025-01-08 07:56:04 -08:00
jawelton74
4b75e86b74 Remove Network Warning banner from Data Explorer. (#2019) 2024-12-12 15:44:28 -08:00
SATYA SB
abf061089d [accessibility-3102896]: [Keyboard Navigation - Azure CosmosDB - Data Explorer]: "Learn more" link is not accessible using keyboard present inside the tooltip of "Analytical store". (#1989)
Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2024-12-10 10:14:19 +05:30
Laurent Nguyen
ec25586a6e Close all tabs and always load first container when initializing Fabric (#2014)
Co-authored-by: Laurent Nguyen <languye@microsoft.com>
2024-12-06 17:14:37 +01:00
tarazou9
c15d1432b2 Fix the filter for getting offering id (#2015)
Fix offering id filter
2024-12-05 13:04:46 -05:00
Laurent Nguyen
73d2686025 Restore open collection tabs: Query, Documents, Settings (#2004)
* Persist multiple query texts

* Save multiple query tab histories

* Save and restore states for QueryTab and DocumentsTab for SQL and Mongo

* Enable Collection Scale/Settings restore

* Persist documents tab current filter

* Fix DocumentsTab conflict resolve mistake

* Remove unused variable

* Fix e2e test

* Fix e2e localStorage reference

* Try clearing local storage via playwright page

* Clear local storage after opening page

* Move restore flag behind feature flag. Whitelist restorable tabs in for Fabric. Restore e2e tests.

* Fix typo

* Fix: avoid setting undefined for preferredSize for the <Allotment.Pane>

* Add comments

* Move restore tabs after knockout configure step from Explorer constructor (which could be called multiple times)
2024-11-28 11:18:55 +01:00
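
A minimal sketch of per-tab persistence, with hypothetical keys and state shape; the real implementation goes through LocalStorageUtility/StorageKey (imported in the CosmosClient.ts diff below) and sits behind a feature flag, with only whitelisted tab types restorable in Fabric.

interface QueryTabState {
  queryText: string;
  splitterDirection: "vertical" | "horizontal";
}

function saveQueryTabState(tabId: string, state: QueryTabState): void {
  localStorage.setItem(`restorableTab:${tabId}`, JSON.stringify(state));
}

function restoreQueryTabState(tabId: string): QueryTabState | undefined {
  const raw = localStorage.getItem(`restorableTab:${tabId}`);
  return raw ? (JSON.parse(raw) as QueryTabState) : undefined;
}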
vchske
80b926214b Vector Embedding and Full Text Search (#2009)
* Replaced monaco editor on Container Vector Policy tab with controls same as on create container ux

* Adds vector embedding policy to container management. Adds FullTextSearch to both add container and container management.

* Fixing unit tests and formatting issues

* More fixes

* Updating full text controls based on feedback

* Minor updates

* Editing test to fix compile issue

* Minor fix

* Adding paths for jest to ignore transform due to recent changes in upstream dependencies

* Adding mock to temporarily get unit tests to pass

* Hiding FTS feature behind the new EnableNoSQLFullTextSearch capability
2024-11-18 12:49:27 -08:00
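
A sketch of the capability gate for the hidden full-text-search feature; CapabilityNames.EnableNoSQLFullTextSearch comes from the Constants diff below, and the account shape is simplified to just what the check needs.

import { CapabilityNames } from "Common/Constants";

interface Capability {
  name: string;
}

function isFullTextSearchCapabilityEnabled(capabilities: Capability[] = []): boolean {
  return capabilities.some((c) => c.name === CapabilityNames.EnableNoSQLFullTextSearch);
}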
Laurent Nguyen
070b7a4ca7 Remove unnecessary padding for Fabric (#2005)
Co-authored-by: Laurent Nguyen <languye@microsoft.com>
2024-11-13 09:52:51 +01:00
jawelton74
d61ff5dcb5 Update upload/download-artifact actions to v4 due to v3 deprecation. (#2006) 2024-11-11 11:17:48 -08:00
Laurent Nguyen
d42eebaa5a Improve DocumentsTab filter input (#1998)
* Rework Input and dropdown in DocumentsTab

* Improve input: implement Escape and add clear button

* Undo body :focus outline, since fluent UI has a nicer focus style

* Close dropdown if last element is tabbed

* Fix unit tests

* Fix theme and remove autocomplete

* Load theme inside rendering function to fix using correct colors

* Remove commented code

* Add aria-label to clear filter button

* Fix format

* Fix keyboard navigation with tab and arrow up/down. Clear button becomes down button.

---------

Co-authored-by: Laurent Nguyen <languye@microsoft.com>
2024-11-01 16:59:26 +01:00
Ashley Stanton-Nurse
056be2a74d add more edge cases to Query Error parser (#2003) 2024-10-30 08:43:18 -07:00
jawelton74
b93c90e7d1 Update query advisor privacy statement and link (#2000)
* Update query advisor privacy details.

* Update test snapshot.
2024-10-29 10:56:04 -07:00
Laurent Nguyen
82de81f2b6 Fix row selection issue in DocumentsTab when sorting rows (#1997)
* Fix bug clicking on item highlights wrong row. Remove unused prop.

* Fix clicking on table row on sorted rows and multi-select using ctrl

* Update test snapshots

* Remove unnecessary setTimeout
2024-10-25 12:08:55 +02:00
Ashley Stanton-Nurse
236f075cf6 fix layout issues in fabric (#1996) 2024-10-24 09:59:03 -07:00
bogercraig
d478af3869 Correct spelling of CreateDocumen to CreateDocuments (#1995) 2024-10-22 14:23:07 -07:00
Laurent Nguyen
93c1fdc238 Revert "Persist and restore query text, tab position and splitter direction in QueryTabComponent (#1993)" (#1994)
This reverts commit d562fc0f40.
2024-10-22 19:03:10 +02:00
Laurent Nguyen
d562fc0f40 Persist and restore query text, tab position and splitter direction in QueryTabComponent (#1993)
* Save query text, tab splitter direction and position in QueryTabComponent

* Fix unit tests
2024-10-22 14:31:09 +02:00
bogercraig
808faa9fa5 CP and MP API Overrides from Config.json (#1992)
* Force useMongoProxyEndpoint to always return true if valid endpoint provided.  Enables new Mongo proxy in all environments.

* Checking MP endpoint in config context.

* Enabling cassandra proxy in all environments.  Requires later cleanup.

* Simplifying and removing endpoint validation since run when config context is generated.

* Enabling one MP API at a time globally.

* Revert to existing CP selection logic.

* Creating list of globally enabled CP APIs.

* Add list of mongo and cassandra APIs to config and only enable if environment outside existing list of environments.

* Remove environment checks.  If API globally enabled, return true.

* Adding config initialization for mongo unit tests.

* Default to empty enable list to minimize possible impact.  Config.json overrides can be used for testing.
2024-10-17 13:41:22 -07:00
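
A sketch of the config-driven gate described above; the globallyEnabledMongoAPIs list mirrors the config field exercised in the MongoProxyClient tests further down, MongoProxyApi comes from the Constants diff, and the helper name is hypothetical.

import { MongoProxyApi } from "Common/Constants";

function isMongoProxyApiEnabled(api: string, globallyEnabledMongoAPIs: string[]): boolean {
  // An API on the globally enabled list is allowed in every environment;
  // anything else falls back to the environment allow-list checks.
  return globallyEnabledMongoAPIs.includes(api);
}

// Example: a config.json override that turns bulk delete on everywhere.
isMongoProxyApiEnabled(MongoProxyApi.BulkDelete, [MongoProxyApi.BulkDelete]); // true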
Vsevolod Kukol
c1bc11d27d Support multi-tenant switching for Data Plane RBAC (#1988)
* Fix API endpoint for CassandraProxy query API

* activate Mongo Proxy and Cassandra Proxy in Prod

* Add CP Prod endpoint

* Run npm format and tests

* Revert code

* fix bug that blocked local mongo proxy and cassandra proxy development

* Add prod endpoint

* fix pr check tests

* Remove prod

* Remove prod endpoint

* Remove dev endpoint

* Support data plane RBAC

* Support data plane RBAC

* Add additional changes for Portal RBAC functionality

* Remove unnecessary code

* Remove unnecessary code

* Add code to fix VCoreMongo/PG bug

* Address feedback

* Add more logs for RBAC feature

* Add more logs for RBAC features

* Add AAD endpoints for all environments

* Add AAD endpoints

* Run npm format

* Support multi-tenant switching for Data Plane RBAC

* Remove tenantID duplicates

---------

Co-authored-by: Senthamil Sindhu <sindhuba@microsoft.com>
Co-authored-by: Asier Isayas <aisayas@microsoft.com>
2024-10-10 07:36:19 -07:00
sindhuba
ac2e2a6f8e Add tenantId info in Data Explorer while opening from Portal (#1987)
* Fix API endpoint for CassandraProxy query API

* activate Mongo Proxy and Cassandra Proxy in Prod

* Add CP Prod endpoint

* Run npm format and tests

* Revert code

* fix bug that blocked local mongo proxy and cassandra proxy development

* Add prod endpoint

* fix pr check tests

* Remove prod

* Remove prod endpoint

* Remove dev endpoint

* Support data plane RBAC

* Support data plane RBAC

* Add additional changes for Portal RBAC functionality

* Remove unnecessary code

* Remove unnecessary code

* Add code to fix VCoreMongo/PG bug

* Address feedback

* Add more logs for RBAC feature

* Add more logs for RBAC features

* Add AAD endpoints for all environments

* Add AAD endpoints

* Run npm format

* Support multi-tenant switching for Data plane RBAC

* Run npm format

---------

Co-authored-by: Asier Isayas <aisayas@microsoft.com>
2024-10-09 15:41:58 -07:00
Laurent Nguyen
3138580eae Move column selection out of mpac (#1980) 2024-10-09 14:23:31 +02:00
SATYA SB
aa88815c6e [Keyboard Navigation - Azure Cosmos DB - Data Explorer]: Keyboard focus is not retaining back to 'more' button after closing 'Delete container' dialog. (#1978)
* [accessibility-2262594]: [Keyboard Navigation - Azure Cosmos DB - Data Explorer]: Keyboard focus is not retaining back to 'more' button after closing 'Delete container' dialog.

* Optimize closeSidePanel: add timeout cleanup to prevent memory leaks and ensure proper focus behavior

---------

Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2024-10-07 09:09:23 +05:30
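
A sketch of a focus-restoring close handler with timeout cleanup, assuming a React hook; the selector and delay are illustrative rather than the values used in the side-panel code.

import { useEffect, useRef } from "react";

function useRestoreFocusOnClose(isOpen: boolean, triggerSelector: string): void {
  const timeoutRef = useRef<number | undefined>(undefined);

  useEffect(() => {
    if (!isOpen) {
      // Defer until the side panel has unmounted, then send focus back to the trigger.
      timeoutRef.current = window.setTimeout(() => {
        (document.querySelector(triggerSelector) as HTMLElement | null)?.focus();
      }, 100);
    }
    // Clearing any pending timeout prevents leaks and stray focus after unmount.
    return () => window.clearTimeout(timeoutRef.current);
  }, [isOpen, triggerSelector]);
}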
Vsevolod Kukol
5a2f78b51e Improve Entra ID token acquisition logic (#1940)
* Add a silent parameter to acquireTokenWithMsal

If true, the function won't retry to sign in using a Popup if silent token acquisition fails.

* Improve Login for Entra ID RBAC button logic

Try to reuse an existing signed-in MSAL account to get the AAD token
and fall back to full sign-in otherwise.

Also move the logic to AuthorizationUtils

* Try to acquire an Entra ID token silently on startup.

When running in Portal MSAL should be able to reuse the
MSAL account from Portal and allow us to silently get
the RBAC token. If it fails we'll show the Login for Entra ID RBAC
button as usual.

* Small code improvements

* Remove the RBAC notice from settings pane
and try to acquire RBAC token silently after enabling RBAC.

* Use msal.ssoSilent with an optional login hint
to avoid more sign-in popups.
msal.loginPopup will be used as a backup option if ssoSilent fails.
Ideally the parent environment (Portal/Fabric) should send
a loginHint with the username of the currently signed in user that
can be passed to the token acquisition flow.

* Improve RBAC error wording, clarifying where to find the Login button.
2024-10-04 08:45:29 +02:00
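
A sketch of the silent-first acquisition flow described above, using @azure/msal-browser; the client id, scopes, and login hint are placeholders, and the real logic lives in AuthorizationUtils.

import { PublicClientApplication } from "@azure/msal-browser";

const msal = new PublicClientApplication({ auth: { clientId: "<client-id>" } });

async function acquireAadToken(scopes: string[], loginHint?: string, silent = true): Promise<string | null> {
  try {
    // Reuse an already signed-in account (e.g. the Portal's) without showing UI.
    const result = await msal.ssoSilent({ scopes, loginHint });
    return result.accessToken;
  } catch (error) {
    if (silent) {
      // Silent-only callers get null and show the "Login for Entra ID RBAC" button instead.
      return null;
    }
    // Otherwise fall back to an interactive popup sign-in.
    const result = await msal.loginPopup({ scopes });
    return result.accessToken;
  }
}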
bogercraig
fbc2e1335b Pull Additional Allowed Cassandra and Mongo Proxy Endpoints from Deployed Config (#1984)
* Updating to take default cassandra proxy endpoints from external config.json.

* Updating allow list for mongo proxy endpoints.
2024-10-02 14:05:21 -07:00
SATYA SB
eb0d7b71b3 [accessibility-3100029]:[Screen Reader - Azure Cosmos DB - Add Table Row]: Descriptive Label is not provided for 'Value' edit fields under 'Add Table Row' pane. (#1970)
Co-authored-by: Satyapriya Bai <v-satybai@microsoft.com>
2024-10-01 09:15:25 +05:30
Asier Isayas
261289b031 Remove legacy backend references in tests and local dev (#1983)
* remove legacy backend references in tests and local dev

* fix unit tests

* fixed bulk delete

* fix tests

* fix cosmosclient

---------

Co-authored-by: Asier Isayas <aisayas@microsoft.com>
2024-09-30 14:34:37 -04:00
Asier Isayas
fae4589427 Bulk Delete API fix (#1977)
* Bulk Delete API fix

* Bulk Delete API fix

---------

Co-authored-by: Asier Isayas <aisayas@microsoft.com>
2024-09-30 11:40:48 -04:00
jawelton74
cbcb7e6240 Switch E2E tests to use new accounts. (#1982) 2024-09-30 07:29:24 -07:00
jawelton74
e0b773d920 Set shared throughput default to false for New Databases (#1981)
* Introduce common function for shared throughput default and set to
false.

* Add new file.

* Adjust E2E tests to not set throughput for database create.
2024-09-27 09:59:41 -07:00
Ashley Stanton-Nurse
9ec2cea95c Ensure the "Ctrl+Alt+["/"Ctrl+Alt+]" shortcuts don't get triggered on "AltGr+8"/"AltGr+9" (#1979)
* Remove the "Ctrl+Alt+[" and "Ctrl+Alt+]" shortcuts, as they conflict on non-US keyboard layouts

* Use "BracketLeft" and "BracketRight" to re-enable shortcut for US keyboards
2024-09-25 09:15:54 -07:00
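
A sketch of the layout-safe check: matching the physical key via KeyboardEvent.code ("BracketLeft"/"BracketRight") means AltGr+8 / AltGr+9 (reported as Ctrl+Alt on Windows) no longer trigger the shortcuts on non-US layouts. Handler names are hypothetical.

function isCtrlAltBracketRight(e: KeyboardEvent): boolean {
  return e.ctrlKey && e.altKey && e.code === "BracketRight";
}

function isCtrlAltBracketLeft(e: KeyboardEvent): boolean {
  return e.ctrlKey && e.altKey && e.code === "BracketLeft";
}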
Ashley Stanton-Nurse
1a4f713a79 Clarifying copy-edit to delete database panel (#1974)
* change 'Database id' to 'Database name' in Delete Database confirm prompt

* put 'name' in a parenthetical instead of replacing 'id'

* update test snapshots
2024-09-23 11:34:49 -07:00
174 changed files with 8701 additions and 3823 deletions

View File

@@ -1 +1 @@
[Preview this branch](https://cosmos-explorer-preview.azurewebsites.net/pull/EDIT_THIS_NUMBER_IN_THE_PR_DESCRIPTION?feature.someFeatureFlagYouMightNeed=true)
[Preview this branch](https://dataexplorer-preview.azurewebsites.net/pull/EDIT_THIS_NUMBER_IN_THE_PR_DESCRIPTION?feature.someFeatureFlagYouMightNeed=true)

View File

@@ -83,7 +83,7 @@ jobs:
- run: npm ci
- run: npm run build:contracts
- name: Restore Build Cache
uses: actions/cache@v2
uses: actions/cache@v4
with:
path: .cache
key: ${{ runner.os }}-build-cache
@@ -92,18 +92,20 @@ jobs:
NODE_OPTIONS: "--max-old-space-size=4096"
- run: cp -r ./Contracts ./dist/contracts
- run: cp -r ./configs ./dist/configs
- uses: actions/upload-artifact@v3
- uses: actions/upload-artifact@v4
with:
name: dist
path: dist/
- name: "Az CLI login"
uses: azure/login@v1
with:
client-id: ${{ secrets.AZURE_CLIENT_ID }}
tenant-id: ${{ secrets.AZURE_TENANT_ID }}
subscription-id: ${{ secrets.PREVIEW_SUBSCRIPTION_ID }}
- name: Upload build to preview blob storage
run: az storage blob upload-batch -d '$web' -s 'dist' --account-name cosmosexplorerpreview --destination-path "${{github.event.pull_request.head.sha || github.sha}}" --account-key="${PREVIEW_STORAGE_KEY}" --overwrite true
env:
PREVIEW_STORAGE_KEY: ${{ secrets.PREVIEW_STORAGE_KEY }}
run: az storage blob upload-batch -d '$web' -s 'dist' --account-name ${{ secrets.PREVIEW_STORAGE_ACCOUNT_NAME }} --destination-path "${{github.event.pull_request.head.sha || github.sha}}" --auth-mode login --overwrite true
- name: Upload preview config to blob storage
run: az storage blob upload -c '$web' -f ./preview/config.json --account-name cosmosexplorerpreview --name "${{github.event.pull_request.head.sha || github.sha}}/config.json" --account-key="${PREVIEW_STORAGE_KEY}" --overwrite true
env:
PREVIEW_STORAGE_KEY: ${{ secrets.PREVIEW_STORAGE_KEY }}
run: az storage blob upload -c '$web' -f ./preview/config.json --account-name ${{ secrets.PREVIEW_STORAGE_ACCOUNT_NAME }} --name "${{github.event.pull_request.head.sha || github.sha}}/config.json" --auth-mode login --overwrite true
nuget:
name: Publish Nuget
if: github.ref == 'refs/heads/master' || contains(github.ref, 'hotfix/') || contains(github.ref, 'release/')
@@ -113,21 +115,21 @@ jobs:
NUGET_SOURCE: ${{ secrets.NUGET_SOURCE }}
AZURE_DEVOPS_PAT: ${{ secrets.AZURE_DEVOPS_PAT }}
steps:
- uses: nuget/setup-nuget@v1
with:
nuget-api-key: ${{ secrets.NUGET_API_KEY }}
- name: Download Dist Folder
uses: actions/download-artifact@v3
uses: actions/download-artifact@v4
with:
name: dist
- run: cp ./configs/prod.json config.json
- run: nuget sources add -Name "ADO" -Source "$NUGET_SOURCE" -UserName "jawelton@microsoft.com" -Password "$AZURE_DEVOPS_PAT"
- run: nuget pack -Version "2.0.0-github-${GITHUB_SHA}"
- run: nuget push -SkipDuplicate -Source "$NUGET_SOURCE" -ApiKey Az *.nupkg
- uses: actions/upload-artifact@v3
name: packages
- run: dotnet nuget add source "$NUGET_SOURCE" --name "ADO" --username "jawelton@microsoft.com" --password "$AZURE_DEVOPS_PAT" --store-password-in-clear-text
- run: dotnet pack DataExplorer.proj /p:PackageVersion="2.0.0-github-${GITHUB_SHA}"
- run: dotnet nuget push "bin/Release/*.nupkg" --skip-duplicate --api-key Az --source="$NUGET_SOURCE"
- run: dotnet nuget remove source "ADO"
- uses: actions/upload-artifact@v4
name: Upload package to Artifacts
with:
path: "*.nupkg"
name: prod-package
path: "bin/Release/*.nupkg"
nugetmpac:
name: Publish Nuget MPAC
if: github.ref == 'refs/heads/master' || contains(github.ref, 'hotfix/') || contains(github.ref, 'release/')
@@ -137,22 +139,21 @@ jobs:
NUGET_SOURCE: ${{ secrets.NUGET_SOURCE }}
AZURE_DEVOPS_PAT: ${{ secrets.AZURE_DEVOPS_PAT }}
steps:
- uses: nuget/setup-nuget@v1
with:
nuget-api-key: ${{ secrets.NUGET_API_KEY }}
- name: Download Dist Folder
uses: actions/download-artifact@v3
uses: actions/download-artifact@v4
with:
name: dist
- run: cp ./configs/mpac.json config.json
- run: sed -i 's/Azure.Cosmos.DB.Data.Explorer/Azure.Cosmos.DB.Data.Explorer.MPAC/g' DataExplorer.nuspec
- run: nuget sources add -Name "ADO" -Source "$NUGET_SOURCE" -UserName "jawelton@microsoft.com" -Password "$AZURE_DEVOPS_PAT"
- run: nuget pack -Version "2.0.0-github-${GITHUB_SHA}"
- run: nuget push -SkipDuplicate -Source "$NUGET_SOURCE" -ApiKey Az *.nupkg
- uses: actions/upload-artifact@v3
name: packages
- run: dotnet nuget add source "$NUGET_SOURCE" --name "ADO" --username "jawelton@microsoft.com" --password "$AZURE_DEVOPS_PAT" --store-password-in-clear-text
- run: dotnet pack DataExplorer.proj /p:PackageVersion="2.0.0-github-${GITHUB_SHA}"
- run: dotnet nuget push "bin/Release/*.nupkg" --skip-duplicate --api-key Az --source="$NUGET_SOURCE"
- run: dotnet nuget remove source "ADO"
- uses: actions/upload-artifact@v4
name: Upload package to Artifacts
with:
path: "*.nupkg"
name: mpac-package
path: "bin/Release/*.nupkg"
playwright-tests:
name: "Run Playwright Tests (Shard ${{ matrix.shardIndex }} of ${{ matrix.shardTotal }})"
@@ -185,9 +186,9 @@ jobs:
if: ${{ !cancelled() }}
uses: actions/upload-artifact@v4
with:
name: blob-report-${{ matrix.shardIndex }}
path: blob-report
retention-days: 1
name: blob-report-${{ matrix.shardIndex }}
path: blob-report
retention-days: 1
merge-playwright-reports:
name: "Merge Playwright Reports"
@@ -197,26 +198,26 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 18
- name: Install dependencies
run: npm ci
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 18
- name: Install dependencies
run: npm ci
- name: Download blob reports from GitHub Actions Artifacts
uses: actions/download-artifact@v4
with:
path: all-blob-reports
pattern: blob-report-*
merge-multiple: true
- name: Download blob reports from GitHub Actions Artifacts
uses: actions/download-artifact@v4
with:
path: all-blob-reports
pattern: blob-report-*
merge-multiple: true
- name: Merge into HTML Report
run: npx playwright merge-reports --reporter html ./all-blob-reports
- name: Merge into HTML Report
run: npx playwright merge-reports --reporter html ./all-blob-reports
- name: Upload HTML report
uses: actions/upload-artifact@v4
with:
name: html-report--attempt-${{ github.run_attempt }}
path: playwright-report
retention-days: 14
- name: Upload HTML report
uses: actions/upload-artifact@v4
with:
name: html-report--attempt-${{ github.run_attempt }}
path: playwright-report
retention-days: 14

9
DataExplorer.proj Normal file
View File

@@ -0,0 +1,9 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net8.0</TargetFramework>
<NoBuild>true</NoBuild>
<IncludeBuildOutput>false</IncludeBuildOutput>
<NuspecFile>DataExplorer.nuspec</NuspecFile>
<NuspecProperties>version=$(PackageVersion)</NuspecProperties>
</PropertyGroup>
</Project>

View File

@@ -18,7 +18,7 @@ Run `npm start` to start the development server and automatically rebuild on cha
### Hosted Development (https://cosmos.azure.com)
- Visit: `https://localhost:1234/hostedExplorer.html`
- The default webpack dev server configuration will proxy requests to the production portal backend: `https://main.documentdb.ext.azure.com`. This will allow you to use production connection strings on your local machine.
- The default webpack dev server configuration will proxy requests to the production portal backend: `https://cdb-ms-mpac-pbe.cosmos.azure.com`. This will allow you to use production connection strings on your local machine.
### Emulator Development

View File

@@ -82,7 +82,7 @@
</a>
<ul>
<li>Visit: <code>https://localhost:1234/hostedExplorer.html</code></li>
<li>The default webpack dev server configuration will proxy requests to the production portal backend: <code>https://main.documentdb.ext.azure.com</code>. This will allow you to use production connection strings on your local machine.</li>
<li>The default webpack dev server configuration will proxy requests to the production portal backend: <code>https://cdb-ms-mpac-pbe.cosmos.azure.com</code>. This will allow you to use production connection strings on your local machine.</li>
</ul>
<a href="#emulator-development" id="emulator-development" style="color: inherit; text-decoration: none;">
<h3>Emulator Development</h3>

View File

@@ -174,7 +174,11 @@ module.exports = {
},
// An array of regexp pattern strings that are matched against all source file paths, matched files will skip transformation
transformIgnorePatterns: ["/node_modules/(?!@fluentui/react-icons)", "/externals/"],
transformIgnorePatterns: [
"/node_modules/(?!@fluentui/react-icons|(.*)/dist/browser)/",
"/node_modules/plotly.js-cartesian-dist-min",
"/externals/",
],
// An array of regexp pattern strings that are matched against all modules before the module loader will automatically return a mock for them
// unmockedModulePathPatterns: undefined,

View File

@@ -1830,6 +1830,14 @@ input::-webkit-calendar-picker-indicator::after {
transform: rotate(90deg);
}
.customAccordion button:focus {
.focus();
}
.customAccordion {
margin-top: 1px;
}
.datalist-arrow:after:hover {
content: "\276F";
position: absolute;
@@ -3117,3 +3125,7 @@ a:link {
background: white;
height: 100%;
}
.sidebarContainer {
height: 100%;
}

View File

@@ -20,6 +20,10 @@ a:focus {
text-decoration: underline;
}
.splashLoaderContainer {
background-color: #f5f5f5;
}
#divExplorer {
background-color: #f5f5f5;
}
@@ -27,26 +31,24 @@ a:focus {
.resourceTreeAndTabs {
border-radius: 0px;
box-shadow: @FabricBoxBorderShadow;
margin: @FabricBoxMargin;
margin-top: 0px;
margin-bottom: 0px;
background-color: #ffffff;
}
.tabsManagerContainer {
background-color: #ffffff
background-color: #ffffff;
}
.nav-tabs-margin {
padding-top: 5px;
background-color: #ffffff
background-color: #ffffff;
}
.commandBarContainer {
background-color: #ffffff;
border-radius: @FabricBoxBorderRadius @FabricBoxBorderRadius 0px 0px;
box-shadow: @FabricBoxBorderShadow;
margin: @FabricBoxMargin;
margin-top: 0px;
margin-bottom: 0px;
padding-top: 2px;
@@ -65,17 +67,16 @@ a:focus {
}
}
.nav-tabs>li>.tabNavContentContainer>.tab_Content:hover {
.nav-tabs > li > .tabNavContentContainer > .tab_Content:hover {
border-bottom: 2px solid #e0e0e0;
}
.nav-tabs>li.active>.tabNavContentContainer>.tab_Content,
.nav-tabs>li.active>.tabNavContentContainer>.tab_Content:hover {
.nav-tabs > li.active > .tabNavContentContainer > .tab_Content,
.nav-tabs > li.active > .tabNavContentContainer > .tab_Content:hover {
border-bottom: 2px solid @FabricAccentMedium;
}
.nav-tabs>li.active>.tabNavContentContainer>.tab_Content>.contentWrapper>.tabNavText {
.nav-tabs > li.active > .tabNavContentContainer > .tab_Content > .contentWrapper > .tabNavText {
border-bottom: 0px none transparent;
}
@@ -94,10 +95,10 @@ a:focus {
padding-bottom: @SmallSpace;
.contentWrapper {
.statusIconContainer {
margin-left: 0px;
}
.statusIconContainer {
margin-left: 0px;
}
}
.tabIconSection {
.cancelButton {
@@ -119,7 +120,6 @@ a:focus {
}
}
.resourceTree {
padding: 12px;
}
@@ -156,25 +156,21 @@ a:focus {
}
.selected {
&>.treeNodeHeader {
& > .treeNodeHeader {
background-color: @FabricAccentExtra;
}
}
}
}
.dataExplorerErrorConsoleContainer {
border-radius: 0px 0px @FabricBoxBorderRadius @FabricBoxBorderRadius;
border-radius: 0px 0px @FabricBoxBorderRadius @FabricBoxBorderRadius;
box-shadow: @FabricBoxBorderShadow;
margin: @FabricBoxMargin;
margin-top: 0px;
width: auto;
align-self: auto;
}
.filterbtnstyle {
background: #fff;
color: #000;
@@ -200,12 +196,10 @@ a:focus {
border: solid 1px #d1d1d1;
}
.gridRowSelected .tabdocumentsGridElement:hover {
background-color: @FabricAccentLight !important;
}
.refreshcol {
filter: brightness(0) saturate(100%);
}
@@ -216,4 +210,4 @@ a:focus {
.fileImportImg img {
filter: brightness(0) saturate(100%);
}
}

771
package-lock.json generated

File diff suppressed because it is too large.

View File

@@ -5,17 +5,16 @@
"main": "index.js",
"dependencies": {
"@azure/arm-cosmosdb": "9.1.0",
"@azure/cosmos": "4.0.1-beta.3",
"@azure/cosmos": "4.2.0-beta.1",
"@azure/cosmos-language-service": "0.0.5",
"@azure/identity": "1.5.2",
"@azure/ms-rest-nodeauth": "3.1.1",
"@azure/identity": "4.5.0",
"@azure/msal-browser": "2.14.2",
"@babel/plugin-proposal-class-properties": "7.12.1",
"@babel/plugin-proposal-decorators": "7.12.12",
"@fluentui/react": "8.119.0",
"@fluentui/react-components": "9.54.2",
"@jupyterlab/services": "6.0.2",
"@jupyterlab/terminal": "3.0.3",
"@jupyterlab/services": "6.0.2",
"@microsoft/applicationinsights-web": "2.6.1",
"@nteract/commutable": "7.5.1",
"@nteract/connected-components": "6.8.2",
@@ -47,6 +46,7 @@
"@types/mkdirp": "1.0.1",
"@types/node-fetch": "2.5.7",
"@xmldom/xmldom": "0.7.13",
"@xterm/xterm": "5.5.0",
"allotment": "1.20.2",
"applicationinsights": "1.8.0",
"bootstrap": "3.4.1",
@@ -117,7 +117,7 @@
"@babel/preset-env": "7.24.7",
"@babel/preset-react": "7.24.7",
"@babel/preset-typescript": "7.24.7",
"@playwright/test": "1.44.0",
"@playwright/test": "1.49.1",
"@testing-library/react": "11.2.3",
"@types/applicationinsights-js": "1.0.7",
"@types/codemirror": "0.0.56",
@@ -170,10 +170,10 @@
"jest": "29.7.0",
"jest-canvas-mock": "2.5.2",
"jest-circus": "29.7.0",
"jest-environment-jsdom": "29.7.0",
"jest-html-loader": "1.0.0",
"jest-react-hooks-shallow": "1.5.1",
"jest-trx-results-processor": "3.0.2",
"jest-environment-jsdom": "29.7.0",
"less": "3.8.1",
"less-loader": "11.1.3",
"less-vars-loader": "1.1.0",
@@ -247,4 +247,4 @@
"printWidth": 120,
"endOfLine": "auto"
}
}
}

View File

@@ -1,7 +1,7 @@
[defaults]
group = stfaul
sku = P1v2
appserviceplan = stfaul_asp_Linux_centralus_0
location = centralus
web = cosmos-explorer-preview
group = dataexplorer-preview
sku = P1V2
appserviceplan = dataexplorer-preview
location = westus2
web = dataexplorer-preview

View File

@@ -4,8 +4,8 @@ Cosmos Explorer Preview makes it possible to try a working version of any commit
Initial support is for Hosted (Connection string only) or the Azure Portal. Examples:
Connection string URLs: https://cosmos-explorer-preview.azurewebsites.net/commit/COMMIT_SHA/hostedExplorer.html
Portal URLs: https://ms.portal.azure.com/?dataExplorerSource=https://cosmos-explorer-preview.azurewebsites.net/commit/COMMIT_SHA/explorer.html#home
Connection string URLs: https://dataexplorer-preview.azurewebsites.net/commit/COMMIT_SHA/hostedExplorer.html
Portal URLs: https://ms.portal.azure.com/?dataExplorerSource=https://dataexplorer-preview.azurewebsites.net/commit/COMMIT_SHA/explorer.html#home
In both cases replace `COMMIT_SHA` with the commit you want to view. It must have already completed its build on GitHub Actions.

View File

@@ -1,4 +1,4 @@
{
"PROXY_PATH": "/proxy",
"msalRedirectURI": "https://cosmos-explorer-preview.azurewebsites.net/"
"msalRedirectURI": "https://dataexplorer-preview.azurewebsites.net/"
}

View File

@@ -3,8 +3,15 @@ const { createProxyMiddleware } = require("http-proxy-middleware");
const port = process.env.PORT || 3000;
const fetch = require("node-fetch");
const api = createProxyMiddleware("/api", {
target: "https://main.documentdb.ext.azure.com",
const backendEndpoint = "https://cdb-ms-mpac-pbe.cosmos.azure.com";
const previewSiteEndpoint = "https://dataexplorer-preview.azurewebsites.net";
const previewStorageWebsiteEndpoint = "https://dataexplorerpreview.z5.web.core.windows.net/";
const githubApiUrl = "https://api.github.com/repos/Azure/cosmos-explorer";
const githubPullRequestUrl = "https://github.com/Azure/cosmos-explorer/pull";
const azurePortalMpacEndpoint = "https://ms.portal.azure.com/";
const api = createProxyMiddleware({
target: backendEndpoint,
changeOrigin: true,
logLevel: "debug",
bypass: (req, res) => {
@@ -15,8 +22,8 @@ const api = createProxyMiddleware("/api", {
},
});
const proxy = createProxyMiddleware("/proxy", {
target: "https://main.documentdb.ext.azure.com",
const proxy = createProxyMiddleware({
target: backendEndpoint,
changeOrigin: true,
secure: false,
logLevel: "debug",
@@ -27,35 +34,38 @@ const proxy = createProxyMiddleware("/proxy", {
},
});
const commit = createProxyMiddleware("/commit", {
target: "https://cosmosexplorerpreview.blob.core.windows.net",
const commit = createProxyMiddleware({
target: previewStorageWebsiteEndpoint,
changeOrigin: true,
secure: false,
logLevel: "debug",
pathRewrite: { "^/commit": "$web/" },
pathRewrite: { "^/commit": "/" },
});
const app = express();
app.use(api);
app.use(proxy);
app.use(commit);
app.use("/api", api);
app.use("/proxy", proxy);
app.use("/commit", commit);
app.get("/pull/:pr(\\d+)", (req, res) => {
const pr = req.params.pr;
if (!/^\d+$/.test(pr)) {
return res.status(400).send("Invalid pull request number");
}
const [, query] = req.originalUrl.split("?");
const search = new URLSearchParams(query);
fetch("https://api.github.com/repos/Azure/cosmos-explorer/pulls/" + pr)
fetch(`${githubApiUrl}/pulls/${pr}`)
.then((response) => response.json())
.then(({ head: { ref, sha } }) => {
const prUrl = new URL("https://github.com/Azure/cosmos-explorer/pull/" + pr);
const prUrl = new URL(`${githubPullRequestUrl}/${pr}`);
prUrl.hash = ref;
search.set("feature.pr", prUrl.href);
const explorer = new URL("https://cosmos-explorer-preview.azurewebsites.net/commit/" + sha + "/explorer.html");
const explorer = new URL(`${previewSiteEndpoint}/commit/${sha}/explorer.html`);
explorer.search = search.toString();
const portal = new URL("https://ms.portal.azure.com/");
const portal = new URL(azurePortalMpacEndpoint);
portal.searchParams.set("dataExplorerSource", explorer.href);
return res.redirect(portal.href);
@@ -63,12 +73,10 @@ app.get("/pull/:pr(\\d+)", (req, res) => {
.catch(() => res.sendStatus(500));
});
app.get("/", (req, res) => {
fetch("https://api.github.com/repos/Azure/cosmos-explorer/branches/master")
fetch(`${githubApiUrl}/branches/master`)
.then((response) => response.json())
.then(({ commit: { sha } }) => {
const explorer = new URL(
"https://cosmos-explorer-preview.azurewebsites.net/commit/" + sha + "/hostedExplorer.html"
);
const explorer = new URL(`${previewSiteEndpoint}/commit/${sha}/hostedExplorer.html`);
return res.redirect(explorer.href);
})
.catch(() => res.sendStatus(500));

1360
preview/package-lock.json generated

File diff suppressed because it is too large.

View File

@@ -4,7 +4,7 @@
"description": "",
"main": "index.js",
"scripts": {
"deploy": "az webapp up --name \"cosmos-explorer-preview\" --subscription \"cosmosdb-portalteam-generaltest-msft\" --resource-group \"stfaul\"",
"deploy": "az webapp up --name \"dataexplorer-preview\" --subscription \"cosmosdb-portalteam-runners\" --resource-group \"dataexplorer-preview\" --runtime \"NODE:18-lts\" --sku P1V2",
"start": "node index.js",
"test": "echo \"Error: no test specified\" && exit 1"
},
@@ -12,7 +12,8 @@
"author": "Microsoft Corporation",
"dependencies": {
"express": "^4.17.1",
"http-proxy-middleware": "^1.1.0",
"http-proxy-middleware": "^3.0.3",
"node": "^18.20.6",
"node-fetch": "^2.6.1"
}
}
}

View File

@@ -89,6 +89,7 @@ export class CapabilityNames {
public static readonly EnableMongo: string = "EnableMongo";
public static readonly EnableServerless: string = "EnableServerless";
public static readonly EnableNoSQLVectorSearch: string = "EnableNoSQLVectorSearch";
public static readonly EnableNoSQLFullTextSearch: string = "EnableNoSQLFullTextSearch";
}
export enum CapacityMode {
@@ -96,6 +97,12 @@ export enum CapacityMode {
Serverless = "Serverless",
}
export enum WorkloadType {
Learning = "Learning",
DevelopmentTesting = "Development/Testing",
Production = "Production",
None = "None",
}
// flight names returned from the portal are always lowercase
export class Flights {
public static readonly SettingsV2 = "settingsv2";
@@ -118,6 +125,7 @@ export class AfecFeatures {
export class TagNames {
public static defaultExperience: string = "defaultExperience";
public static WorkloadType: string = "hidden-workload-type";
}
export class MongoDBAccounts {
@@ -148,13 +156,25 @@ export class PortalBackendEndpoints {
}
export class MongoProxyEndpoints {
public static readonly Local: string = "https://localhost:7238";
public static readonly Development: string = "https://localhost:7238";
public static readonly Mpac: string = "https://cdb-ms-mpac-mp.cosmos.azure.com";
public static readonly Prod: string = "https://cdb-ms-prod-mp.cosmos.azure.com";
public static readonly Fairfax: string = "https://cdb-ff-prod-mp.cosmos.azure.us";
public static readonly Mooncake: string = "https://cdb-mc-prod-mp.cosmos.azure.cn";
}
export class MongoProxyApi {
public static readonly ResourceList: string = "ResourceList";
public static readonly QueryDocuments: string = "QueryDocuments";
public static readonly CreateDocument: string = "CreateDocument";
public static readonly ReadDocument: string = "ReadDocument";
public static readonly UpdateDocument: string = "UpdateDocument";
public static readonly DeleteDocument: string = "DeleteDocument";
public static readonly CreateCollectionWithProxy: string = "CreateCollectionWithProxy";
public static readonly LegacyMongoShell: string = "LegacyMongoShell";
public static readonly BulkDelete: string = "BulkDelete";
}
export class CassandraProxyEndpoints {
public static readonly Development: string = "https://localhost:7240";
public static readonly Mpac: string = "https://cdb-ms-mpac-cp.cosmos.azure.com";
@@ -505,6 +525,11 @@ export class PriorityLevel {
public static readonly Default = "low";
}
export class ariaLabelForLearnMoreLink {
public static readonly AnalyticalStore = "Learn more about analytical store.";
public static readonly AzureSynapseLink = "Learn more about Azure Synapse Link.";
}
export const QueryCopilotSampleDatabaseId = "CopilotSampleDB";
export const QueryCopilotSampleContainerId = "SampleContainer";

View File

@@ -1,4 +1,5 @@
import { Platform, resetConfigContext, updateConfigContext } from "../ConfigContext";
import { PortalBackendEndpoints } from "Common/Constants";
import { configContext, Platform, resetConfigContext, updateConfigContext } from "../ConfigContext";
import { updateUserContext } from "../UserContext";
import { endpoint, getTokenFromAuthService, requestPlugin } from "./CosmosClient";
@@ -20,22 +21,22 @@ describe("getTokenFromAuthService", () => {
it("builds the correct URL in production", () => {
updateConfigContext({
BACKEND_ENDPOINT: "https://main.documentdb.ext.azure.com",
PORTAL_BACKEND_ENDPOINT: PortalBackendEndpoints.Prod,
});
getTokenFromAuthService("GET", "dbs", "foo");
expect(window.fetch).toHaveBeenCalledWith(
"https://main.documentdb.ext.azure.com/api/guest/runtimeproxy/authorizationTokens",
`${configContext.PORTAL_BACKEND_ENDPOINT}/api/connectionstring/runtimeproxy/authorizationtokens`,
expect.any(Object),
);
});
it("builds the correct URL in dev", () => {
updateConfigContext({
BACKEND_ENDPOINT: "https://localhost:1234",
PORTAL_BACKEND_ENDPOINT: PortalBackendEndpoints.Development,
});
getTokenFromAuthService("GET", "dbs", "foo");
expect(window.fetch).toHaveBeenCalledWith(
"https://localhost:1234/api/guest/runtimeproxy/authorizationTokens",
`${configContext.PORTAL_BACKEND_ENDPOINT}/api/connectionstring/runtimeproxy/authorizationtokens`,
expect.any(Object),
);
});
@@ -78,7 +79,7 @@ describe("requestPlugin", () => {
const next = jest.fn();
updateConfigContext({
platform: Platform.Hosted,
BACKEND_ENDPOINT: "https://localhost:1234",
PORTAL_BACKEND_ENDPOINT: "https://localhost:1234",
PROXY_PATH: "/proxy",
});
const headers = {};

View File

@@ -3,12 +3,11 @@ import { getAuthorizationTokenUsingResourceTokens } from "Common/getAuthorizatio
import { AuthorizationToken } from "Contracts/FabricMessageTypes";
import { checkDatabaseResourceTokensValidity } from "Platform/Fabric/FabricUtil";
import { LocalStorageUtility, StorageKey } from "Shared/StorageUtility";
import { useNewPortalBackendEndpoint } from "Utils/EndpointUtils";
import { AuthType } from "../AuthType";
import { BackendApi, PriorityLevel } from "../Common/Constants";
import { PriorityLevel } from "../Common/Constants";
import * as Logger from "../Common/Logger";
import { Platform, configContext } from "../ConfigContext";
import { userContext } from "../UserContext";
import { updateUserContext, userContext } from "../UserContext";
import { logConsoleError } from "../Utils/NotificationConsoleUtils";
import * as PriorityBasedExecutionUtils from "../Utils/PriorityBasedExecutionUtils";
import { EmulatorMasterKey, HttpHeaders } from "./Constants";
@@ -27,7 +26,7 @@ export const tokenProvider = async (requestInfo: Cosmos.RequestInfo) => {
);
if (!userContext.aadToken) {
logConsoleError(
`AAD token does not exist. Please click on "Login for Entra ID" button prior to performing Entra ID RBAC operations`,
`AAD token does not exist. Please use the "Login for Entra ID" button in the Toolbar prior to performing Entra ID RBAC operations`,
);
return null;
}
@@ -125,10 +124,6 @@ export async function getTokenFromAuthService(
resourceType: string,
resourceId?: string,
): Promise<AuthorizationToken> {
if (!useNewPortalBackendEndpoint(BackendApi.RuntimeProxy)) {
return getTokenFromAuthService_ToBeDeprecated(verb, resourceType, resourceId);
}
try {
const host: string = configContext.PORTAL_BACKEND_ENDPOINT;
const response: Response = await _global.fetch(host + "/api/connectionstring/runtimeproxy/authorizationtokens", {
@@ -151,34 +146,6 @@ export async function getTokenFromAuthService(
}
}
export async function getTokenFromAuthService_ToBeDeprecated(
verb: string,
resourceType: string,
resourceId?: string,
): Promise<AuthorizationToken> {
try {
const host = configContext.BACKEND_ENDPOINT;
const response = await _global.fetch(host + "/api/guest/runtimeproxy/authorizationTokens", {
method: "POST",
headers: {
"content-type": "application/json",
"x-ms-encrypted-auth-token": userContext.accessToken,
},
body: JSON.stringify({
verb,
resourceType,
resourceId,
}),
});
//TODO I am not sure why we have to parse the JSON again here. fetch should do it for us when we call .json()
const result = JSON.parse(await response.json());
return result;
} catch (error) {
logConsoleError(`Failed to get authorization headers for ${resourceType}: ${getErrorMessage(error)}`);
return Promise.reject(error);
}
}
// The Capability is a bitmap, which cosmosdb backend decodes as per the below enum
enum SDKSupportedCapabilities {
None = 0,
@@ -189,13 +156,24 @@ let _client: Cosmos.CosmosClient;
export function client(): Cosmos.CosmosClient {
if (_client) {
if (!userContext.hasDataPlaneRbacSettingChanged) {
if (!userContext.refreshCosmosClient) {
return _client;
}
_client.dispose();
_client = null;
}
if (userContext.refreshCosmosClient) {
updateUserContext({
refreshCosmosClient: false,
});
}
let _defaultHeaders: Cosmos.CosmosHeaders = {};
_defaultHeaders["x-ms-cosmos-sdk-supportedcapabilities"] =
SDKSupportedCapabilities.None | SDKSupportedCapabilities.PartitionMerge;
_defaultHeaders["x-ms-cosmos-throughput-bucket"] = 1;
if (
userContext.authType === AuthType.ConnectionString ||

View File

@@ -0,0 +1,34 @@
import { WorkloadType } from "Common/Constants";
import { getWorkloadType } from "Common/DatabaseAccountUtility";
import { DatabaseAccount, Tags } from "Contracts/DataModels";
import { updateUserContext } from "UserContext";
describe("Database Account Utility", () => {
describe("Workload Type", () => {
beforeEach(() => {
updateUserContext({
databaseAccount: {
tags: {} as Tags,
} as DatabaseAccount,
});
});
it("Workload Type should return Learning", () => {
updateUserContext({
databaseAccount: {
tags: {
"hidden-workload-type": WorkloadType.Learning,
} as Tags,
} as DatabaseAccount,
});
const workloadType: WorkloadType = getWorkloadType();
expect(workloadType).toBe(WorkloadType.Learning);
});
it("Workload Type should return None", () => {
const workloadType: WorkloadType = getWorkloadType();
expect(workloadType).toBe(WorkloadType.None);
});
});
});

View File

@@ -1,3 +1,5 @@
import { TagNames, WorkloadType } from "Common/Constants";
import { Tags } from "Contracts/DataModels";
import { userContext } from "../UserContext";
function isVirtualNetworkFilterEnabled() {
@@ -15,3 +17,12 @@ function isPrivateEndpointConnectionsEnabled() {
export function isPublicInternetAccessAllowed(): boolean {
return !isVirtualNetworkFilterEnabled() && !isIpRulesEnabled() && !isPrivateEndpointConnectionsEnabled();
}
export function getWorkloadType(): WorkloadType {
const tags: Tags = userContext?.databaseAccount?.tags;
const workloadType: WorkloadType = tags && (tags[TagNames.WorkloadType] as WorkloadType);
if (!workloadType) {
return WorkloadType.None;
}
return workloadType;
}

View File

@@ -0,0 +1,3 @@
export function getNewDatabaseSharedThroughputDefault(): boolean {
return false;
}

View File

@@ -10,6 +10,7 @@ export interface TableEntityProps {
isEntityValueDisable?: boolean;
entityTimeValue: string;
entityValueType: string;
entityProperty: string;
onEntityValueChange: (event: React.FormEvent<HTMLElement>, newInput?: string) => void;
onSelectDate: (date: Date | null | undefined) => void;
onEntityTimeValueChange: (event: React.FormEvent<HTMLElement>, newInput?: string) => void;
@@ -26,6 +27,7 @@ export const EntityValue: FunctionComponent<TableEntityProps> = ({
onSelectDate,
isEntityValueDisable,
onEntityTimeValueChange,
entityProperty,
}: TableEntityProps): JSX.Element => {
if (isEntityTypeDate) {
return (
@@ -51,15 +53,20 @@ export const EntityValue: FunctionComponent<TableEntityProps> = ({
}
return (
<TextField
label={entityValueLabel && entityValueLabel}
className="addEntityTextField"
disabled={isEntityValueDisable}
type={entityValueType}
placeholder={entityValuePlaceholder}
value={typeof entityValue === "string" ? entityValue : ""}
onChange={onEntityValueChange}
ariaLabel={attributeValueLabel}
/>
<>
<span id={entityProperty} className="screenReaderOnly">
Edit Property {entityProperty} {attributeValueLabel}
</span>
<TextField
label={entityValueLabel && entityValueLabel}
className="addEntityTextField"
disabled={isEntityValueDisable}
type={entityValueType}
placeholder={entityValuePlaceholder}
value={typeof entityValue === "string" ? entityValue : ""}
onChange={onEntityValueChange}
aria-labelledby={entityProperty}
/>
</>
);
};

View File

@@ -1,18 +1,11 @@
import { MongoProxyEndpoints } from "Common/Constants";
import { AuthType } from "../AuthType";
import { resetConfigContext, updateConfigContext } from "../ConfigContext";
import { configContext, resetConfigContext, updateConfigContext } from "../ConfigContext";
import { DatabaseAccount } from "../Contracts/DataModels";
import { Collection } from "../Contracts/ViewModels";
import DocumentId from "../Explorer/Tree/DocumentId";
import { extractFeatures } from "../Platform/Hosted/extractFeatures";
import { updateUserContext } from "../UserContext";
import {
deleteDocument,
getEndpoint,
getFeatureEndpointOrDefault,
queryDocuments,
readDocument,
updateDocument,
} from "./MongoProxyClient";
import { deleteDocuments, getEndpoint, queryDocuments, readDocument, updateDocument } from "./MongoProxyClient";
const databaseId = "testDB";
@@ -71,7 +64,8 @@ describe("MongoProxyClient", () => {
databaseAccount,
});
updateConfigContext({
BACKEND_ENDPOINT: "https://main.documentdb.ext.azure.com",
MONGO_PROXY_ENDPOINT: MongoProxyEndpoints.Prod,
globallyEnabledMongoAPIs: [],
});
window.fetch = jest.fn().mockImplementation(fetchMock);
});
@@ -82,16 +76,19 @@ describe("MongoProxyClient", () => {
it("builds the correct URL", () => {
queryDocuments(databaseId, collection, true, "{}");
expect(window.fetch).toHaveBeenCalledWith(
"https://main.documentdb.ext.azure.com/api/mongo/explorer/resourcelist?db=testDB&coll=testCollection&resourceUrl=bardbs%2FtestDB%2Fcolls%2FtestCollection%2Fdocs%2F&rid=testCollectionrid&rtype=docs&sid=&rg=&dba=foo&pk=pk",
`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer/resourcelist`,
expect.any(Object),
);
});
it("builds the correct proxy URL in development", () => {
updateConfigContext({ MONGO_BACKEND_ENDPOINT: "https://localhost:1234" });
updateConfigContext({
MONGO_PROXY_ENDPOINT: "https://localhost:1234",
globallyEnabledMongoAPIs: [],
});
queryDocuments(databaseId, collection, true, "{}");
expect(window.fetch).toHaveBeenCalledWith(
"https://localhost:1234/api/mongo/explorer/resourcelist?db=testDB&coll=testCollection&resourceUrl=bardbs%2FtestDB%2Fcolls%2FtestCollection%2Fdocs%2F&rid=testCollectionrid&rtype=docs&sid=&rg=&dba=foo&pk=pk",
`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer/resourcelist`,
expect.any(Object),
);
});
@@ -103,7 +100,8 @@ describe("MongoProxyClient", () => {
databaseAccount,
});
updateConfigContext({
BACKEND_ENDPOINT: "https://main.documentdb.ext.azure.com",
MONGO_PROXY_ENDPOINT: MongoProxyEndpoints.Prod,
globallyEnabledMongoAPIs: [],
});
window.fetch = jest.fn().mockImplementation(fetchMock);
});
@@ -114,16 +112,19 @@ describe("MongoProxyClient", () => {
it("builds the correct URL", () => {
readDocument(databaseId, collection, documentId);
expect(window.fetch).toHaveBeenCalledWith(
"https://main.documentdb.ext.azure.com/api/mongo/explorer?db=testDB&coll=testCollection&resourceUrl=bardb%2FtestDB%2Fdb%2FtestCollection%2FtestId&rid=testId&rtype=docs&sid=&rg=&dba=foo&pk=pk",
`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer`,
expect.any(Object),
);
});
it("builds the correct proxy URL in development", () => {
updateConfigContext({ MONGO_BACKEND_ENDPOINT: "https://localhost:1234" });
updateConfigContext({
MONGO_PROXY_ENDPOINT: "https://localhost:1234",
globallyEnabledMongoAPIs: [],
});
readDocument(databaseId, collection, documentId);
expect(window.fetch).toHaveBeenCalledWith(
"https://localhost:1234/api/mongo/explorer?db=testDB&coll=testCollection&resourceUrl=bardb%2FtestDB%2Fdb%2FtestCollection%2FtestId&rid=testId&rtype=docs&sid=&rg=&dba=foo&pk=pk",
`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer`,
expect.any(Object),
);
});
@@ -135,7 +136,8 @@ describe("MongoProxyClient", () => {
databaseAccount,
});
updateConfigContext({
BACKEND_ENDPOINT: "https://main.documentdb.ext.azure.com",
MONGO_PROXY_ENDPOINT: MongoProxyEndpoints.Prod,
globallyEnabledMongoAPIs: [],
});
window.fetch = jest.fn().mockImplementation(fetchMock);
});
@@ -146,16 +148,19 @@ describe("MongoProxyClient", () => {
it("builds the correct URL", () => {
readDocument(databaseId, collection, documentId);
expect(window.fetch).toHaveBeenCalledWith(
"https://main.documentdb.ext.azure.com/api/mongo/explorer?db=testDB&coll=testCollection&resourceUrl=bardb%2FtestDB%2Fdb%2FtestCollection%2FtestId&rid=testId&rtype=docs&sid=&rg=&dba=foo&pk=pk",
`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer`,
expect.any(Object),
);
});
it("builds the correct proxy URL in development", () => {
updateConfigContext({ MONGO_BACKEND_ENDPOINT: "https://localhost:1234" });
updateConfigContext({
MONGO_PROXY_ENDPOINT: "https://localhost:1234",
globallyEnabledMongoAPIs: [],
});
readDocument(databaseId, collection, documentId);
expect(window.fetch).toHaveBeenCalledWith(
"https://localhost:1234/api/mongo/explorer?db=testDB&coll=testCollection&resourceUrl=bardb%2FtestDB%2Fdb%2FtestCollection%2FtestId&rid=testId&rtype=docs&sid=&rg=&dba=foo&pk=pk",
`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer`,
expect.any(Object),
);
});
@@ -167,7 +172,8 @@ describe("MongoProxyClient", () => {
databaseAccount,
});
updateConfigContext({
BACKEND_ENDPOINT: "https://main.documentdb.ext.azure.com",
MONGO_PROXY_ENDPOINT: MongoProxyEndpoints.Prod,
globallyEnabledMongoAPIs: [],
});
window.fetch = jest.fn().mockImplementation(fetchMock);
});
@@ -178,28 +184,20 @@ describe("MongoProxyClient", () => {
it("builds the correct URL", () => {
updateDocument(databaseId, collection, documentId, "{}");
expect(window.fetch).toHaveBeenCalledWith(
"https://main.documentdb.ext.azure.com/api/mongo/explorer?db=testDB&coll=testCollection&resourceUrl=bardb%2FtestDB%2Fdb%2FtestCollection%2Fdocs%2FtestId&rid=testId&rtype=docs&sid=&rg=&dba=foo&pk=pk",
expect.any(Object),
);
});
it("builds the correct proxy URL in development", () => {
updateConfigContext({ MONGO_BACKEND_ENDPOINT: "https://localhost:1234" });
updateDocument(databaseId, collection, documentId, "{}");
expect(window.fetch).toHaveBeenCalledWith(
"https://localhost:1234/api/mongo/explorer?db=testDB&coll=testCollection&resourceUrl=bardb%2FtestDB%2Fdb%2FtestCollection%2Fdocs%2FtestId&rid=testId&rtype=docs&sid=&rg=&dba=foo&pk=pk",
`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer`,
expect.any(Object),
);
});
});
describe("deleteDocument", () => {
describe("deleteDocuments", () => {
beforeEach(() => {
resetConfigContext();
updateUserContext({
databaseAccount,
});
updateConfigContext({
BACKEND_ENDPOINT: "https://main.documentdb.ext.azure.com",
MONGO_PROXY_ENDPOINT: MongoProxyEndpoints.Prod,
globallyEnabledMongoAPIs: [],
});
window.fetch = jest.fn().mockImplementation(fetchMock);
});
@@ -208,18 +206,21 @@ describe("MongoProxyClient", () => {
});
it("builds the correct URL", () => {
deleteDocument(databaseId, collection, documentId);
deleteDocuments(databaseId, collection, [documentId]);
expect(window.fetch).toHaveBeenCalledWith(
"https://main.documentdb.ext.azure.com/api/mongo/explorer?db=testDB&coll=testCollection&resourceUrl=bardb%2FtestDB%2Fdb%2FtestCollection%2Fdocs%2FtestId&rid=testId&rtype=docs&sid=&rg=&dba=foo&pk=pk",
`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer/bulkdelete`,
expect.any(Object),
);
});
it("builds the correct proxy URL in development", () => {
updateConfigContext({ MONGO_BACKEND_ENDPOINT: "https://localhost:1234" });
deleteDocument(databaseId, collection, documentId);
updateConfigContext({
MONGO_PROXY_ENDPOINT: "https://localhost:1234",
globallyEnabledMongoAPIs: [],
});
deleteDocuments(databaseId, collection, [documentId]);
expect(window.fetch).toHaveBeenCalledWith(
"https://localhost:1234/api/mongo/explorer?db=testDB&coll=testCollection&resourceUrl=bardb%2FtestDB%2Fdb%2FtestCollection%2Fdocs%2FtestId&rid=testId&rtype=docs&sid=&rg=&dba=foo&pk=pk",
`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer/bulkdelete`,
expect.any(Object),
);
});
@@ -231,13 +232,14 @@ describe("MongoProxyClient", () => {
databaseAccount,
});
updateConfigContext({
BACKEND_ENDPOINT: "https://main.documentdb.ext.azure.com",
MONGO_PROXY_ENDPOINT: MongoProxyEndpoints.Prod,
globallyEnabledMongoAPIs: [],
});
});
it("returns a production endpoint", () => {
const endpoint = getEndpoint("https://main.documentdb.ext.azure.com");
expect(endpoint).toEqual("https://main.documentdb.ext.azure.com/api/mongo/explorer");
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT);
expect(endpoint).toEqual(`${configContext.MONGO_PROXY_ENDPOINT}/api/mongo/explorer`);
});
it("returns a development endpoint", () => {
@@ -249,35 +251,8 @@ describe("MongoProxyClient", () => {
updateUserContext({
authType: AuthType.EncryptedToken,
});
const endpoint = getEndpoint("https://main.documentdb.ext.azure.com");
expect(endpoint).toEqual("https://main.documentdb.ext.azure.com/api/guest/mongo/explorer");
});
});
describe("getFeatureEndpointOrDefault", () => {
beforeEach(() => {
resetConfigContext();
updateConfigContext({
BACKEND_ENDPOINT: "https://main.documentdb.ext.azure.com",
});
const params = new URLSearchParams({
"feature.mongoProxyEndpoint": "https://localhost:12901",
"feature.mongoProxyAPIs": "readDocument|createDocument",
});
const features = extractFeatures(params);
updateUserContext({
authType: AuthType.AAD,
features: features,
});
});
it("returns a local endpoint", () => {
const endpoint = getFeatureEndpointOrDefault("readDocument");
expect(endpoint).toEqual("https://localhost:12901/api/mongo/explorer");
});
it("returns a production endpoint", () => {
const endpoint = getFeatureEndpointOrDefault("deleteDocument");
expect(endpoint).toEqual("https://main.documentdb.ext.azure.com/api/mongo/explorer");
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT);
expect(endpoint).toEqual(`${configContext.MONGO_PROXY_ENDPOINT}/api/connectionstring/mongo/explorer`);
});
});
});
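Taken together, the assertions above pin down the URL shapes the rewritten client is expected to emit. The following is a minimal sketch inferred from those expectations only; the real branching inside getEndpoint is not fully shown in this diff, so the encrypted-token handling here is an assumption.

// Sketch only, inferred from the test expectations above (not the shipped getEndpoint).
function sketchGetEndpoint(mongoProxyEndpoint: string, isEncryptedTokenAuth: boolean): string {
  const url = `${mongoProxyEndpoint}/api/mongo/explorer`;
  // The CRUD and query tests expect the plain explorer path; the last test in this file
  // expects the connection-string variant when auth uses an encrypted token.
  return isEncryptedTokenAuth ? url.replace("/api/mongo/", "/api/connectionstring/mongo/") : url;
}

// sketchGetEndpoint("https://localhost:1234", false) === "https://localhost:1234/api/mongo/explorer"
// sketchGetEndpoint("https://localhost:1234", true)  === "https://localhost:1234/api/connectionstring/mongo/explorer"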

View File

@@ -1,20 +1,13 @@
import { Constants as CosmosSDKConstants } from "@azure/cosmos";
import {
allowedMongoProxyEndpoints,
allowedMongoProxyEndpoints_ToBeDeprecated,
validateEndpoint,
} from "Utils/EndpointUtils";
import queryString from "querystring";
import { AuthType } from "../AuthType";
import { configContext } from "../ConfigContext";
import * as DataModels from "../Contracts/DataModels";
import { MessageTypes } from "../Contracts/ExplorerContracts";
import { Collection } from "../Contracts/ViewModels";
import DocumentId from "../Explorer/Tree/DocumentId";
import { hasFlag } from "../Platform/Hosted/extractFeatures";
import { userContext } from "../UserContext";
import { logConsoleError } from "../Utils/NotificationConsoleUtils";
import { ApiType, ContentType, HttpHeaders, HttpStatusCodes, MongoProxyEndpoints } from "./Constants";
import { ApiType, ContentType, HttpHeaders, HttpStatusCodes } from "./Constants";
import { MinimalQueryIterator } from "./IteratorUtilities";
import { sendMessage } from "./MessageHandler";
@@ -67,10 +60,6 @@ export function queryDocuments(
query: string,
continuationToken?: string,
): Promise<QueryResponse> {
if (!useMongoProxyEndpoint("resourcelist") || !useMongoProxyEndpoint("queryDocuments")) {
return queryDocuments_ToBeDeprecated(databaseId, collection, isResourceList, query, continuationToken);
}
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const params = {
@@ -89,7 +78,7 @@ export function queryDocuments(
query,
};
const endpoint = getFeatureEndpointOrDefault("resourcelist") || "";
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT) || "";
const headers = {
...defaultHeaders,
@@ -127,76 +116,11 @@ export function queryDocuments(
});
}
function queryDocuments_ToBeDeprecated(
databaseId: string,
collection: Collection,
isResourceList: boolean,
query: string,
continuationToken?: string,
): Promise<QueryResponse> {
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const params = {
db: databaseId,
coll: collection.id(),
resourceUrl: `${resourceEndpoint}dbs/${databaseId}/colls/${collection.id()}/docs/`,
rid: collection.rid,
rtype: "docs",
sid: userContext.subscriptionId,
rg: userContext.resourceGroup,
dba: databaseAccount.name,
pk:
collection && collection.partitionKey && !collection.partitionKey.systemKey
? collection.partitionKeyProperties?.[0]
: "",
};
const endpoint = getFeatureEndpointOrDefault("resourcelist") || "";
const headers = {
...defaultHeaders,
...authHeaders(),
[CosmosSDKConstants.HttpHeaders.IsQuery]: "true",
[CosmosSDKConstants.HttpHeaders.PopulateQueryMetrics]: "true",
[CosmosSDKConstants.HttpHeaders.EnableScanInQuery]: "true",
[CosmosSDKConstants.HttpHeaders.EnableCrossPartitionQuery]: "true",
[CosmosSDKConstants.HttpHeaders.ParallelizeCrossPartitionQuery]: "true",
[HttpHeaders.contentType]: "application/query+json",
};
if (continuationToken) {
headers[CosmosSDKConstants.HttpHeaders.Continuation] = continuationToken;
}
const path = isResourceList ? "/resourcelist" : "";
return window
.fetch(`${endpoint}${path}?${queryString.stringify(params)}`, {
method: "POST",
body: JSON.stringify({ query }),
headers,
})
.then(async (response) => {
if (response.ok) {
return {
continuationToken: response.headers.get(CosmosSDKConstants.HttpHeaders.Continuation),
documents: (await response.json()).Documents as DataModels.DocumentId[],
headers: response.headers,
};
}
await errorHandling(response, "querying documents", params);
return undefined;
});
}
export function readDocument(
databaseId: string,
collection: Collection,
documentId: DocumentId,
): Promise<DataModels.DocumentId> {
if (!useMongoProxyEndpoint("readDocument")) {
return readDocument_ToBeDeprecated(databaseId, collection, documentId);
}
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const idComponents = documentId.self.split("/");
@@ -217,7 +141,7 @@ export function readDocument(
: "",
};
const endpoint = getFeatureEndpointOrDefault("readDocument");
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT);
return window
.fetch(endpoint, {
@@ -237,61 +161,12 @@ export function readDocument(
});
}
export function readDocument_ToBeDeprecated(
databaseId: string,
collection: Collection,
documentId: DocumentId,
): Promise<DataModels.DocumentId> {
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const idComponents = documentId.self.split("/");
const path = idComponents.slice(0, 4).join("/");
const rid = encodeURIComponent(idComponents[5]);
const params = {
db: databaseId,
coll: collection.id(),
resourceUrl: `${resourceEndpoint}${path}/${rid}`,
rid,
rtype: "docs",
sid: userContext.subscriptionId,
rg: userContext.resourceGroup,
dba: databaseAccount.name,
pk:
documentId && documentId.partitionKey && !documentId.partitionKey.systemKey
? documentId.partitionKeyProperties?.[0]
: "",
};
const endpoint = getFeatureEndpointOrDefault("readDocument");
return window
.fetch(`${endpoint}?${queryString.stringify(params)}`, {
method: "GET",
headers: {
...defaultHeaders,
...authHeaders(),
[CosmosSDKConstants.HttpHeaders.PartitionKey]: encodeURIComponent(
JSON.stringify(documentId.partitionKeyHeader()),
),
},
})
.then(async (response) => {
if (response.ok) {
return response.json();
}
return await errorHandling(response, "reading document", params);
});
}
export function createDocument(
databaseId: string,
collection: Collection,
partitionKeyProperty: string,
documentContent: unknown,
): Promise<DataModels.DocumentId> {
if (!useMongoProxyEndpoint("createDocument")) {
return createDocument_ToBeDeprecated(databaseId, collection, partitionKeyProperty, documentContent);
}
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const params = {
@@ -308,7 +183,7 @@ export function createDocument(
documentContent: JSON.stringify(documentContent),
};
const endpoint = getFeatureEndpointOrDefault("createDocument");
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT);
return window
.fetch(`${endpoint}/createDocument`, {
@@ -328,54 +203,12 @@ export function createDocument(
});
}
export function createDocument_ToBeDeprecated(
databaseId: string,
collection: Collection,
partitionKeyProperty: string,
documentContent: unknown,
): Promise<DataModels.DocumentId> {
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const params = {
db: databaseId,
coll: collection.id(),
resourceUrl: `${resourceEndpoint}dbs/${databaseId}/colls/${collection.id()}/docs/`,
rid: collection.rid,
rtype: "docs",
sid: userContext.subscriptionId,
rg: userContext.resourceGroup,
dba: databaseAccount.name,
pk: collection && collection.partitionKey && !collection.partitionKey.systemKey ? partitionKeyProperty : "",
};
const endpoint = getFeatureEndpointOrDefault("createDocument");
return window
.fetch(`${endpoint}/resourcelist?${queryString.stringify(params)}`, {
method: "POST",
body: JSON.stringify(documentContent),
headers: {
...defaultHeaders,
...authHeaders(),
},
})
.then(async (response) => {
if (response.ok) {
return response.json();
}
return await errorHandling(response, "creating document", params);
});
}
export function updateDocument(
databaseId: string,
collection: Collection,
documentId: DocumentId,
documentContent: string,
): Promise<DataModels.DocumentId> {
if (!useMongoProxyEndpoint("updateDocument")) {
return updateDocument_ToBeDeprecated(databaseId, collection, documentId, documentContent);
}
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const idComponents = documentId.self.split("/");
@@ -396,7 +229,7 @@ export function updateDocument(
: "",
documentContent,
};
const endpoint = getFeatureEndpointOrDefault("updateDocument");
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT);
return window
.fetch(endpoint, {
@@ -417,139 +250,6 @@ export function updateDocument(
});
}
export function updateDocument_ToBeDeprecated(
databaseId: string,
collection: Collection,
documentId: DocumentId,
documentContent: string,
): Promise<DataModels.DocumentId> {
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const idComponents = documentId.self.split("/");
const path = idComponents.slice(0, 5).join("/");
const rid = encodeURIComponent(idComponents[5]);
const params = {
db: databaseId,
coll: collection.id(),
resourceUrl: `${resourceEndpoint}${path}/${rid}`,
rid,
rtype: "docs",
sid: userContext.subscriptionId,
rg: userContext.resourceGroup,
dba: databaseAccount.name,
pk:
documentId && documentId.partitionKey && !documentId.partitionKey.systemKey
? documentId.partitionKeyProperties?.[0]
: "",
};
const endpoint = getFeatureEndpointOrDefault("updateDocument");
return window
.fetch(`${endpoint}?${queryString.stringify(params)}`, {
method: "PUT",
body: documentContent,
headers: {
...defaultHeaders,
...authHeaders(),
[HttpHeaders.contentType]: ContentType.applicationJson,
[CosmosSDKConstants.HttpHeaders.PartitionKey]: JSON.stringify(documentId.partitionKeyHeader()),
},
})
.then(async (response) => {
if (response.ok) {
return response.json();
}
return await errorHandling(response, "updating document", params);
});
}
export function deleteDocument(databaseId: string, collection: Collection, documentId: DocumentId): Promise<void> {
if (!useMongoProxyEndpoint("deleteDocument")) {
return deleteDocument_ToBeDeprecated(databaseId, collection, documentId);
}
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const idComponents = documentId.self.split("/");
const path = idComponents.slice(0, 5).join("/");
const rid = encodeURIComponent(idComponents[5]);
const params = {
databaseID: databaseId,
collectionID: collection.id(),
resourceUrl: `${resourceEndpoint}${path}/${rid}`,
resourceID: rid,
resourceType: "docs",
subscriptionID: userContext.subscriptionId,
resourceGroup: userContext.resourceGroup,
databaseAccountName: databaseAccount.name,
partitionKey:
documentId && documentId.partitionKey && !documentId.partitionKey.systemKey
? documentId.partitionKeyProperties?.[0]
: "",
};
const endpoint = getFeatureEndpointOrDefault("deleteDocument");
return window
.fetch(endpoint, {
method: "DELETE",
body: JSON.stringify(params),
headers: {
...defaultHeaders,
...authHeaders(),
[HttpHeaders.contentType]: ContentType.applicationJson,
},
})
.then(async (response) => {
if (response.ok) {
return undefined;
}
return await errorHandling(response, "deleting document", params);
});
}
export function deleteDocument_ToBeDeprecated(
databaseId: string,
collection: Collection,
documentId: DocumentId,
): Promise<void> {
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const idComponents = documentId.self.split("/");
const path = idComponents.slice(0, 5).join("/");
const rid = encodeURIComponent(idComponents[5]);
const params = {
db: databaseId,
coll: collection.id(),
resourceUrl: `${resourceEndpoint}${path}/${rid}`,
rid,
rtype: "docs",
sid: userContext.subscriptionId,
rg: userContext.resourceGroup,
dba: databaseAccount.name,
pk:
documentId && documentId.partitionKey && !documentId.partitionKey.systemKey
? documentId.partitionKeyProperties?.[0]
: "",
};
const endpoint = getFeatureEndpointOrDefault("deleteDocument");
return window
.fetch(`${endpoint}?${queryString.stringify(params)}`, {
method: "DELETE",
headers: {
...defaultHeaders,
...authHeaders(),
[HttpHeaders.contentType]: ContentType.applicationJson,
[CosmosSDKConstants.HttpHeaders.PartitionKey]: JSON.stringify(documentId.partitionKeyHeader()),
},
})
.then(async (response) => {
if (response.ok) {
return undefined;
}
return await errorHandling(response, "deleting document", params);
});
}
export function deleteDocuments(
databaseId: string,
collection: Collection,
@@ -561,7 +261,10 @@ export function deleteDocuments(
const { databaseAccount } = userContext;
const resourceEndpoint = databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint;
const rids = documentIds.map((documentId) => documentId.id());
const rids: string[] = documentIds.map((documentId) => {
const idComponents = documentId.self.split("/");
return idComponents[5];
});
const params = {
databaseID: databaseId,
@@ -572,7 +275,7 @@ export function deleteDocuments(
resourceGroup: userContext.resourceGroup,
databaseAccountName: databaseAccount.name,
};
const endpoint = getFeatureEndpointOrDefault("bulkdelete");
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT);
return window
.fetch(`${endpoint}/bulkdelete`, {
@@ -596,9 +299,6 @@ export function deleteDocuments(
export function createMongoCollectionWithProxy(
params: DataModels.CreateCollectionParams,
): Promise<DataModels.Collection> {
if (!useMongoProxyEndpoint("createCollectionWithProxy")) {
return createMongoCollectionWithProxy_ToBeDeprecated(params);
}
const { databaseAccount } = userContext;
const shardKey: string = params.partitionKey?.paths[0];
@@ -619,7 +319,7 @@ export function createMongoCollectionWithProxy(
isSharded: !!shardKey,
};
const endpoint = getFeatureEndpointOrDefault("createCollectionWithProxy");
const endpoint = getEndpoint(configContext.MONGO_PROXY_ENDPOINT);
return window
.fetch(`${endpoint}/createCollection`, {
@@ -639,69 +339,6 @@ export function createMongoCollectionWithProxy(
});
}
export function createMongoCollectionWithProxy_ToBeDeprecated(
params: DataModels.CreateCollectionParams,
): Promise<DataModels.Collection> {
const { databaseAccount } = userContext;
const shardKey: string = params.partitionKey?.paths[0];
const mongoParams: DataModels.MongoParameters = {
resourceUrl: databaseAccount.properties.mongoEndpoint || databaseAccount.properties.documentEndpoint,
db: params.databaseId,
coll: params.collectionId,
pk: shardKey,
offerThroughput: params.autoPilotMaxThroughput || params.offerThroughput,
cd: params.createNewDatabase,
st: params.databaseLevelThroughput,
is: !!shardKey,
rid: "",
rtype: "colls",
sid: userContext.subscriptionId,
rg: userContext.resourceGroup,
dba: databaseAccount.name,
isAutoPilot: !!params.autoPilotMaxThroughput,
};
const endpoint = getFeatureEndpointOrDefault("createCollectionWithProxy");
return window
.fetch(
`${endpoint}/createCollection?${queryString.stringify(
mongoParams as unknown as queryString.ParsedUrlQueryInput,
)}`,
{
method: "POST",
headers: {
...defaultHeaders,
...authHeaders(),
[HttpHeaders.contentType]: "application/json",
},
},
)
.then(async (response) => {
if (response.ok) {
return response.json();
}
return await errorHandling(response, "creating collection", mongoParams);
});
}
export function getFeatureEndpointOrDefault(feature: string): string {
let endpoint;
if (useMongoProxyEndpoint(feature)) {
endpoint = configContext.MONGO_PROXY_ENDPOINT;
} else {
endpoint =
hasFlag(userContext.features.mongoProxyAPIs, feature) &&
validateEndpoint(userContext.features.mongoProxyEndpoint, [
...allowedMongoProxyEndpoints,
...allowedMongoProxyEndpoints_ToBeDeprecated,
])
? userContext.features.mongoProxyEndpoint
: configContext.MONGO_BACKEND_ENDPOINT || configContext.BACKEND_ENDPOINT;
}
return getEndpoint(endpoint);
}
export function getEndpoint(endpoint: string): string {
let url = endpoint + "/api/mongo/explorer";
@@ -715,21 +352,6 @@ export function getEndpoint(endpoint: string): string {
return url;
}
export function useMongoProxyEndpoint(api: string): boolean {
const activeMongoProxyEndpoints: string[] = [
MongoProxyEndpoints.Local,
MongoProxyEndpoints.Mpac,
MongoProxyEndpoints.Prod,
MongoProxyEndpoints.Fairfax,
MongoProxyEndpoints.Mooncake,
];
return (
configContext.NEW_MONGO_APIS?.includes(api) &&
activeMongoProxyEndpoints.includes(configContext.MONGO_PROXY_ENDPOINT)
);
}
export class ThrottlingError extends Error {
constructor(message: string) {
super(message);
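With the _ToBeDeprecated fallbacks gone, every operation in this file resolves its base URL from configContext.MONGO_PROXY_ENDPOINT and issues a plain fetch. A trimmed-down sketch of the query path follows; buildProxyQueryUrl is a hypothetical helper name, not an export of this module.

import queryString from "querystring";

// Illustrative helper: shows how the proxy-only query URL is assembled from the
// endpoint, the optional /resourcelist path, and the stringified params; the query
// itself travels in the POST body.
function buildProxyQueryUrl(
  mongoProxyEndpoint: string,
  isResourceList: boolean,
  params: Record<string, string>,
): string {
  const endpoint = `${mongoProxyEndpoint}/api/mongo/explorer`; // getEndpoint(configContext.MONGO_PROXY_ENDPOINT)
  const path = isResourceList ? "/resourcelist" : "";
  return `${endpoint}${path}?${queryString.stringify(params)}`;
}

// Example call, consistent with the path asserted in the resourcelist test:
// window.fetch(buildProxyQueryUrl(proxy, true, params), { method: "POST", body: JSON.stringify({ query }), headers });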

View File

@@ -36,7 +36,7 @@ describe("QueryError.tryParse", () => {
code: "BadRequest",
message: "Your query is bad, and you should feel bad",
};
const message = JSON.stringify(innerError);
const message = `Message: ${JSON.stringify(innerError)}\r\nActivity ID: 42`;
const outerError = {
code: "BadRequest",
message,
@@ -48,7 +48,7 @@ describe("QueryError.tryParse", () => {
]);
});
// Imitate the value coming from the backend, which has the syntax errors serialized as JSON in the message.
// Imitate the value coming from the backend, which has the syntax errors serialized as JSON in the message, along with a prefix and activity id.
it("handles single-nested error", () => {
const errors = [
{
@@ -69,7 +69,7 @@ describe("QueryError.tryParse", () => {
message: "Your query is bad, and you should feel bad",
errors,
};
const message = JSON.stringify(innerError);
const message = `Message: ${JSON.stringify(innerError)}\r\nActivity ID: 42`;
const outerError = {
code: "BadRequest",
message,
@@ -91,4 +91,23 @@ describe("QueryError.tryParse", () => {
),
]);
});
// Imitate another value we've gotten from the backend, which has a doubly-nested JSON payload.
it("handles double-nested error", () => {
const outerError = {
code: "BadRequest",
message:
'{"code":"BadRequest","message":"{\\"errors\\":[{\\"severity\\":\\"Error\\",\\"location\\":{\\"start\\":7,\\"end\\":18},\\"code\\":\\"SC2005\\",\\"message\\":\\"\'nonexistent\' is not a recognized built-in function name.\\"}]}\\r\\nActivityId: aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa, Windows/10.0.20348 cosmos-netstandard-sdk/3.18.0"}',
};
const result = QueryError.tryParse(outerError, testErrorLocationResolver);
expect(result).toEqual([
new QueryError(
"'nonexistent' is not a recognized built-in function name.",
QueryErrorSeverity.Error,
"SC2005",
new QueryErrorLocation({ offset: 7, lineNumber: 7, column: 7 }, { offset: 18, lineNumber: 18, column: 18 }),
),
]);
});
});

View File

@@ -214,16 +214,28 @@ export default class QueryError {
return null;
}
// Assign to a new variable because of a TypeScript flow typing quirk, see below.
if (message.startsWith("Message: ")) {
// Reassigning this to 'error' restores the original type of 'error', which is 'unknown'.
// So we use a separate variable to avoid this.
message = message.substring("Message: ".length);
// Some newer backends produce a message that contains a doubly-nested JSON payload.
// In this case, the message we get is a fully-complete JSON object we can parse.
// So let's try that first
if (message.startsWith("{") && message.endsWith("}")) {
let outer: unknown = undefined;
try {
outer = JSON.parse(message);
if (typeof outer === "object" && "message" in outer && typeof outer.message === "string") {
message = outer.message;
}
} catch (e) {
// Just continue if the parsing fails. We'll use the fallback logic below.
}
}
const lines = message.split("\n");
message = lines[0].trim();
if (message.startsWith("Message: ")) {
message = message.substring("Message: ".length);
}
let parsed: unknown;
try {
parsed = JSON.parse(message);
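For readability, here is a standalone restatement of the unwrapping order applied above, assuming the doubly-nested shape exercised by the new test (outer JSON, inner message string, first line, optional "Message: " prefix, inner JSON). The helper name is illustrative.

function unwrapBackendErrorMessage(message: string): unknown {
  // 1. If the whole message is a JSON object, prefer its "message" property.
  if (message.startsWith("{") && message.endsWith("}")) {
    try {
      const outer: unknown = JSON.parse(message);
      if (typeof outer === "object" && outer !== null && "message" in outer) {
        const inner = (outer as { message?: unknown }).message;
        if (typeof inner === "string") {
          message = inner;
        }
      }
    } catch {
      // Ignore and fall through to the line/prefix handling below.
    }
  }
  // 2. Keep only the first line (drops the "\r\nActivityId: ..." suffix).
  message = message.split("\n")[0].trim();
  // 3. Strip the optional "Message: " prefix.
  if (message.startsWith("Message: ")) {
    message = message.substring("Message: ".length);
  }
  // 4. Parse the remaining JSON, e.g. { errors: [{ code: "SC2005", ... }] }.
  try {
    return JSON.parse(message);
  } catch {
    return undefined;
  }
}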

View File

@@ -135,6 +135,7 @@ export const TableEntity: FunctionComponent<TableEntityProps> = ({
onEntityValueChange={onEntityValueChange}
onSelectDate={onSelectDate}
onEntityTimeValueChange={onEntityTimeValueChange}
entityProperty={entityProperty}
/>
{!isEntityValueDisable && (
<TooltipHost content="Edit property" id="editTooltip">

View File

@@ -99,6 +99,9 @@ const createSqlContainer = async (params: DataModels.CreateCollectionParams): Pr
if (params.vectorEmbeddingPolicy) {
resource.vectorEmbeddingPolicy = params.vectorEmbeddingPolicy;
}
if (params.fullTextPolicy) {
resource.fullTextPolicy = params.fullTextPolicy;
}
const rpPayload: ARMTypes.SqlDatabaseCreateUpdateParameters = {
properties: {
@@ -270,6 +273,7 @@ const createCollectionWithSDK = async (params: DataModels.CreateCollectionParams
uniqueKeyPolicy: params.uniqueKeyPolicy || undefined,
analyticalStorageTtl: params.analyticalStorageTtl,
vectorEmbeddingPolicy: params.vectorEmbeddingPolicy,
fullTextPolicy: params.fullTextPolicy,
} as ContainerRequest; // TODO: remove cast when https://github.com/Azure/azure-cosmos-js/issues/423 is fixed
const collectionOptions: RequestOptions = {};
const createDatabaseBody: DatabaseRequest = { id: params.databaseId };
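A hedged example of the new optional field flowing through createSqlContainer; the values and ids are illustrative and unrelated CreateCollectionParams fields are elided.

const params = {
  databaseId: "SampleDB",
  collectionId: "SampleContainer",
  fullTextPolicy: {
    defaultLanguage: "en-US",
    fullTextPaths: [{ path: "/description", language: "en-US" }],
  },
  // ...remaining CreateCollectionParams (partition key, throughput, etc.)
};
// With the change above, the ARM path copies params.fullTextPolicy onto the resource
// payload and the SDK path forwards it on the ContainerRequest, in both cases only
// when the field is defined.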

View File

@@ -105,6 +105,8 @@ const readCollectionOfferWithARM = async (databaseId: string, collectionId: stri
? parseInt(resource.softAllowedMaximumThroughput)
: resource.softAllowedMaximumThroughput;
const throughputBuckets = resource?.throughputBuckets;
if (autoscaleSettings) {
return {
id: offerId,
@@ -114,6 +116,7 @@ const readCollectionOfferWithARM = async (databaseId: string, collectionId: stri
offerReplacePending: resource.offerReplacePending === "true",
instantMaximumThroughput,
softAllowedMaximumThroughput,
throughputBuckets,
};
}
@@ -125,6 +128,7 @@ const readCollectionOfferWithARM = async (databaseId: string, collectionId: stri
offerReplacePending: resource.offerReplacePending === "true",
instantMaximumThroughput,
softAllowedMaximumThroughput,
throughputBuckets,
};
}

View File

@@ -1,6 +1,6 @@
import { OfferDefinition, RequestOptions } from "@azure/cosmos";
import { AuthType } from "../../AuthType";
import { Offer, SDKOfferDefinition, UpdateOfferParams } from "../../Contracts/DataModels";
import { Offer, SDKOfferDefinition, ThroughputBucket, UpdateOfferParams } from "../../Contracts/DataModels";
import { userContext } from "../../UserContext";
import {
migrateCassandraKeyspaceToAutoscale,
@@ -359,6 +359,13 @@ const createUpdateOfferBody = (params: UpdateOfferParams): ThroughputSettingsUpd
body.properties.resource.throughput = params.manualThroughput;
}
if (params.throughputBuckets) {
const throughputBuckets = params.throughputBuckets.filter(
(bucket: ThroughputBucket) => bucket.maxThroughputPercentage !== 100,
);
body.properties.resource.throughputBuckets = throughputBuckets;
}
return body;
};
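A standalone restatement of the bucket filter above with illustrative data: buckets left at the default 100% cap are treated as unset and omitted from the ARM update payload.

interface ThroughputBucketSketch {
  id: number;
  maxThroughputPercentage: number;
}

const requestedBuckets: ThroughputBucketSketch[] = [
  { id: 1, maxThroughputPercentage: 70 },
  { id: 2, maxThroughputPercentage: 100 }, // default cap, dropped below
];

const bucketsSentToArm = requestedBuckets.filter((bucket) => bucket.maxThroughputPercentage !== 100);
// bucketsSentToArm === [{ id: 1, maxThroughputPercentage: 70 }]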

View File

@@ -8,16 +8,15 @@ import {
import {
allowedAadEndpoints,
allowedArcadiaEndpoints,
allowedCassandraProxyEndpoints,
allowedEmulatorEndpoints,
allowedGraphEndpoints,
allowedHostedExplorerEndpoints,
allowedJunoOrigins,
allowedMongoBackendEndpoints,
allowedMongoProxyEndpoints,
allowedMsalRedirectEndpoints,
defaultAllowedArmEndpoints,
defaultAllowedBackendEndpoints,
defaultAllowedCassandraProxyEndpoints,
defaultAllowedMongoProxyEndpoints,
validateEndpoint,
} from "Utils/EndpointUtils";
@@ -32,6 +31,8 @@ export interface ConfigContext {
platform: Platform;
allowedArmEndpoints: ReadonlyArray<string>;
allowedBackendEndpoints: ReadonlyArray<string>;
allowedCassandraProxyEndpoints: ReadonlyArray<string>;
allowedMongoProxyEndpoints: ReadonlyArray<string>;
allowedParentFrameOrigins: ReadonlyArray<string>;
gitSha?: string;
proxyPath?: string;
@@ -48,12 +49,9 @@ export interface ConfigContext {
CATALOG_API_KEY: string;
ARCADIA_ENDPOINT: string;
ARCADIA_LIVY_ENDPOINT_DNS_ZONE: string;
BACKEND_ENDPOINT?: string;
PORTAL_BACKEND_ENDPOINT: string;
NEW_BACKEND_APIS?: BackendApi[];
MONGO_BACKEND_ENDPOINT?: string;
MONGO_PROXY_ENDPOINT: string;
NEW_MONGO_APIS?: string[];
CASSANDRA_PROXY_ENDPOINT: string;
NEW_CASSANDRA_APIS?: string[];
PROXY_PATH?: string;
@@ -66,6 +64,8 @@ export interface ConfigContext {
hostedExplorerURL: string;
armAPIVersion?: string;
msalRedirectURI?: string;
globallyEnabledCassandraAPIs?: string[];
globallyEnabledMongoAPIs?: string[];
}
// Default configuration
@@ -73,9 +73,12 @@ let configContext: Readonly<ConfigContext> = {
platform: Platform.Portal,
allowedArmEndpoints: defaultAllowedArmEndpoints,
allowedBackendEndpoints: defaultAllowedBackendEndpoints,
allowedCassandraProxyEndpoints: defaultAllowedCassandraProxyEndpoints,
allowedMongoProxyEndpoints: defaultAllowedMongoProxyEndpoints,
allowedParentFrameOrigins: [
`^https:\\/\\/cosmos\\.azure\\.(com|cn|us)$`,
`^https:\\/\\/[\\.\\w]*portal\\.azure\\.(com|cn|us)$`,
`^https:\\/\\/cdb-(ms|ff|mc)-prod-pbe\\.cosmos\\.azure\\.(com|us|cn)$`,
`^https:\\/\\/[\\.\\w]*portal\\.microsoftazure\\.de$`,
`^https:\\/\\/[\\.\\w]*ext\\.azure\\.(com|cn|us)$`,
`^https:\\/\\/[\\.\\w]*\\.ext\\.microsoftazure\\.de$`,
@@ -85,7 +88,7 @@ let configContext: Readonly<ConfigContext> = {
`^https:\\/\\/.*\\.analysis-df\\.net$`,
`^https:\\/\\/.*\\.analysis-df\\.windows\\.net$`,
`^https:\\/\\/.*\\.azure-test\\.net$`,
`^https:\\/\\/cosmos-explorer-preview\\.azurewebsites\\.net$`,
`^https:\\/\\/dataexplorer-preview\\.azurewebsites\\.net$`,
], // Webpack injects this at build time
gitSha: process.env.GIT_SHA,
hostedExplorerURL: "https://cosmos.azure.com/",
@@ -103,24 +106,14 @@ let configContext: Readonly<ConfigContext> = {
GITHUB_CLIENT_ID: "6cb2f63cf6f7b5cbdeca", // Registered OAuth app: https://github.com/organizations/AzureCosmosDBNotebooks/settings/applications/1189306
GITHUB_TEST_ENV_CLIENT_ID: "b63fc8cbf87fd3c6e2eb", // Registered OAuth app: https://github.com/organizations/AzureCosmosDBNotebooks/settings/applications/1777772
JUNO_ENDPOINT: JunoEndpoints.Prod,
BACKEND_ENDPOINT: "https://main.documentdb.ext.azure.com",
PORTAL_BACKEND_ENDPOINT: PortalBackendEndpoints.Prod,
MONGO_PROXY_ENDPOINT: MongoProxyEndpoints.Prod,
NEW_MONGO_APIS: [
"resourcelist",
"queryDocuments",
"createDocument",
"readDocument",
"updateDocument",
"deleteDocument",
"createCollectionWithProxy",
"legacyMongoShell",
// "bulkdelete",
],
CASSANDRA_PROXY_ENDPOINT: CassandraProxyEndpoints.Prod,
NEW_CASSANDRA_APIS: ["postQuery", "createOrDelete", "getKeys", "getSchema"],
isTerminalEnabled: false,
isPhoenixEnabled: false,
globallyEnabledCassandraAPIs: [],
globallyEnabledMongoAPIs: [],
};
export function resetConfigContext(): void {
@@ -157,22 +150,19 @@ export function updateConfigContext(newContext: Partial<ConfigContext>): void {
if (
!validateEndpoint(
newContext.BACKEND_ENDPOINT,
configContext.allowedBackendEndpoints || defaultAllowedBackendEndpoints,
newContext.MONGO_PROXY_ENDPOINT,
configContext.allowedMongoProxyEndpoints || defaultAllowedMongoProxyEndpoints,
)
) {
delete newContext.BACKEND_ENDPOINT;
}
if (!validateEndpoint(newContext.MONGO_PROXY_ENDPOINT, allowedMongoProxyEndpoints)) {
delete newContext.MONGO_PROXY_ENDPOINT;
}
if (!validateEndpoint(newContext.MONGO_BACKEND_ENDPOINT, allowedMongoBackendEndpoints)) {
delete newContext.MONGO_BACKEND_ENDPOINT;
}
if (!validateEndpoint(newContext.CASSANDRA_PROXY_ENDPOINT, allowedCassandraProxyEndpoints)) {
if (
!validateEndpoint(
newContext.CASSANDRA_PROXY_ENDPOINT,
configContext.allowedCassandraProxyEndpoints || defaultAllowedCassandraProxyEndpoints,
)
) {
delete newContext.CASSANDRA_PROXY_ENDPOINT;
}
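The guard pattern above silently drops endpoint overrides that fail validation, so the previously applied (or default) value wins. Below is a minimal sketch with a stand-in validator; the real validateEndpoint and allow-lists live in Utils/EndpointUtils.

function validateEndpointSketch(endpoint: string | undefined, allowed: ReadonlyArray<string>): boolean {
  // Stand-in only: accept missing values, otherwise require an allow-listed prefix.
  return endpoint === undefined || allowed.some((allowedEndpoint) => endpoint.startsWith(allowedEndpoint));
}

const newContext: { MONGO_PROXY_ENDPOINT?: string } = { MONGO_PROXY_ENDPOINT: "https://not-allowed.example.com" };
if (!validateEndpointSketch(newContext.MONGO_PROXY_ENDPOINT, ["https://localhost:1234"])) {
  delete newContext.MONGO_PROXY_ENDPOINT; // rejected overrides never reach configContext
}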

View File

@@ -9,6 +9,7 @@ export enum TabKind {
Graph,
SQLQuery,
ScaleSettings,
MongoQuery,
}
/**
@@ -51,6 +52,8 @@ export interface OpenCollectionTab extends OpenTab {
*/
export interface OpenQueryTab extends OpenCollectionTab {
query: QueryInfo;
splitterDirection?: "vertical" | "horizontal";
queryViewSizePercent?: number;
}
/**

View File

@@ -6,6 +6,7 @@ export interface ArmEntity {
location: string;
type: string;
kind: string;
tags?: Tags;
}
export interface DatabaseAccount extends ArmEntity {
@@ -41,6 +42,11 @@ export interface DatabaseAccountExtendedProperties {
publicNetworkAccess?: string;
enablePriorityBasedExecution?: boolean;
vcoreMongoEndpoint?: string;
virtualNetworkRules?: VNetRule[];
}
export interface VNetRule {
id: string;
}
export interface DatabaseAccountResponseLocation {
@@ -159,6 +165,7 @@ export interface Collection extends Resource {
analyticalStorageTtl?: number;
geospatialConfig?: GeospatialConfig;
vectorEmbeddingPolicy?: VectorEmbeddingPolicy;
fullTextPolicy?: FullTextPolicy;
schema?: ISchema;
requestSchema?: () => void;
computedProperties?: ComputedProperties;
@@ -199,11 +206,19 @@ export interface IndexingPolicy {
compositeIndexes?: any[];
spatialIndexes?: any[];
vectorIndexes?: VectorIndex[];
fullTextIndexes?: FullTextIndex[];
}
export interface VectorIndex {
path: string;
type: "flat" | "diskANN" | "quantizedFlat";
diskANNShardKey?: string;
indexingSearchListSize?: number;
quantizationByteSize?: number;
}
export interface FullTextIndex {
path: string;
}
export interface ComputedProperty {
@@ -265,6 +280,12 @@ export interface Offer {
offerReplacePending: boolean;
instantMaximumThroughput?: number;
softAllowedMaximumThroughput?: number;
throughputBuckets?: ThroughputBucket[];
}
export interface ThroughputBucket {
id: number;
maxThroughputPercentage: number;
}
export interface SDKOfferDefinition extends Resource {
@@ -342,6 +363,7 @@ export interface CreateCollectionParams {
uniqueKeyPolicy?: UniqueKeyPolicy;
createMongoWildcardIndex?: boolean;
vectorEmbeddingPolicy?: VectorEmbeddingPolicy;
fullTextPolicy?: FullTextPolicy;
}
export interface VectorEmbeddingPolicy {
@@ -355,6 +377,16 @@ export interface VectorEmbedding {
path: string;
}
export interface FullTextPolicy {
defaultLanguage: string;
fullTextPaths: FullTextPath[];
}
export interface FullTextPath {
path: string;
language: string;
}
export interface ReadDatabaseOfferParams {
databaseId: string;
databaseResourceId?: string;
@@ -376,6 +408,7 @@ export interface UpdateOfferParams {
collectionId?: string;
migrateToAutoPilot?: boolean;
migrateToManual?: boolean;
throughputBuckets?: ThroughputBucket[];
}
export interface Notification {
@@ -643,3 +676,5 @@ export interface FeatureRegistration {
state: string;
};
}
export type Tags = { [key: string]: string };
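Illustrative literals exercising the additions to these contracts; only the new fields are populated and the values are made up.

import { IndexingPolicy, Tags } from "Contracts/DataModels";

const indexingPolicyAdditions: Pick<IndexingPolicy, "vectorIndexes" | "fullTextIndexes"> = {
  vectorIndexes: [
    { path: "/embedding", type: "quantizedFlat", quantizationByteSize: 64 },
    { path: "/largeEmbedding", type: "diskANN", indexingSearchListSize: 100 },
  ],
  fullTextIndexes: [{ path: "/description" }],
};

const accountTags: Tags = { environment: "dev", costCenter: "1234" };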

View File

@@ -41,7 +41,7 @@ export enum MessageTypes {
OpenPostgreSQLPasswordReset,
OpenPostgresNetworkingBlade,
OpenCosmosDBNetworkingBlade,
DisplayNPSSurvey,
DisplayNPSSurvey, // unused
OpenVCoreMongoNetworkingBlade,
OpenVCoreMongoConnectionStringsBlade,
GetAuthorizationToken, // unused. Can be removed if the portal uses the same list of enums.

View File

@@ -115,7 +115,13 @@ export interface CollectionBase extends TreeNode {
isSampleCollection?: boolean;
onDocumentDBDocumentsClick(): void;
onNewQueryClick(source: any, event?: MouseEvent, queryText?: string): void;
onNewQueryClick(
source: any,
event?: MouseEvent,
queryText?: string,
splitterDirection?: "horizontal" | "vertical",
queryViewSizePercent?: number,
): void;
expandCollection(): void;
collapseCollection(): void;
getDatabase(): Database;
@@ -126,6 +132,8 @@ export interface Collection extends CollectionBase {
analyticalStorageTtl: ko.Observable<number>;
schema?: DataModels.ISchema;
requestSchema?: () => void;
vectorEmbeddingPolicy: ko.Observable<DataModels.VectorEmbeddingPolicy>;
fullTextPolicy: ko.Observable<DataModels.FullTextPolicy>;
indexingPolicy: ko.Observable<DataModels.IndexingPolicy>;
uniqueKeyPolicy: DataModels.UniqueKeyPolicy;
usageSizeInKB: ko.Observable<number>;
@@ -149,7 +157,13 @@ export interface Collection extends CollectionBase {
onSettingsClick: () => Promise<void>;
onNewGraphClick(): void;
onNewMongoQueryClick(source: any, event?: MouseEvent, queryText?: string): void;
onNewMongoQueryClick(
source: any,
event?: MouseEvent,
queryText?: string,
splitterDirection?: "horizontal" | "vertical",
queryViewSizePercent?: number,
): void;
onNewMongoShellClick(): void;
onNewStoredProcedureClick(source: Collection, event?: MouseEvent): void;
onNewUserDefinedFunctionClick(source: Collection, event?: MouseEvent): void;
@@ -309,6 +323,8 @@ export interface QueryTabOptions extends TabOptions {
partitionKey?: DataModels.PartitionKey;
queryText?: string;
resourceTokenPartitionKey?: string;
splitterDirection?: "horizontal" | "vertical";
queryViewSizePercent?: number;
}
export interface ScriptTabOption extends TabOptions {
@@ -382,13 +398,14 @@ export interface DataExplorerInputsFrame {
databaseAccount: any;
subscriptionId?: string;
resourceGroup?: string;
tenantId?: string;
userName?: string;
masterKey?: string;
hasWriteAccess?: boolean;
authorizationToken?: string;
csmEndpoint?: string;
dnsSuffix?: string;
serverId?: string;
extensionEndpoint?: string;
portalBackendEndpoint?: string;
mongoProxyEndpoint?: string;
cassandraProxyEndpoint?: string;

View File

@@ -56,13 +56,15 @@ export const createDatabaseContextMenu = (container: Explorer, databaseId: strin
if (userContext.apiType !== "Tables" || userContext.features.enableSDKoperations) {
items.push({
iconSrc: DeleteDatabaseIcon,
onClick: () =>
useSidePanel
.getState()
.openSidePanel(
"Delete " + getDatabaseName(),
<DeleteDatabaseConfirmationPanel refreshDatabases={() => container.refreshAllDatabases()} />,
),
onClick: (lastFocusedElement?: React.RefObject<HTMLElement>) => {
(useSidePanel.getState().getRef = lastFocusedElement),
useSidePanel
.getState()
.openSidePanel(
"Delete " + getDatabaseName(),
<DeleteDatabaseConfirmationPanel refreshDatabases={() => container.refreshAllDatabases()} />,
);
},
label: `Delete ${getDatabaseName()}`,
styleClass: "deleteDatabaseMenuItem",
});
@@ -94,17 +96,17 @@ export const createCollectionContextMenuButton = (
iconSrc: HostedTerminalIcon,
onClick: () => {
const selectedCollection: ViewModels.Collection = useSelectedNode.getState().findSelectedCollection();
if (useNotebook.getState().isShellEnabled) {
if (useNotebook.getState().isShellEnabled || userContext.features.enableCloudShell) {
container.openNotebookTerminal(ViewModels.TerminalKind.Mongo);
} else {
selectedCollection && selectedCollection.onNewMongoShellClick();
}
},
label: useNotebook.getState().isShellEnabled ? "Open Mongo Shell" : "New Shell",
label: (useNotebook.getState().isShellEnabled || userContext.features.enableCloudShell) ? "Open Mongo Shell" : "New Shell",
});
}
if (useNotebook.getState().isShellEnabled && userContext.apiType === "Cassandra") {
if ((useNotebook.getState().isShellEnabled || userContext.features.enableCloudShell) && userContext.apiType === "Cassandra") {
items.push({
iconSrc: HostedTerminalIcon,
onClick: () => {
@@ -146,14 +148,15 @@ export const createCollectionContextMenuButton = (
if (configContext.platform !== Platform.Fabric) {
items.push({
iconSrc: DeleteCollectionIcon,
onClick: () => {
onClick: (lastFocusedElement?: React.RefObject<HTMLElement>) => {
useSelectedNode.getState().setSelectedNode(selectedCollection);
useSidePanel
.getState()
.openSidePanel(
"Delete " + getCollectionName(),
<DeleteCollectionConfirmationPane refreshDatabases={() => container.refreshAllDatabases()} />,
);
(useSidePanel.getState().getRef = lastFocusedElement),
useSidePanel
.getState()
.openSidePanel(
"Delete " + getCollectionName(),
<DeleteCollectionConfirmationPane refreshDatabases={() => container.refreshAllDatabases()} />,
);
},
label: `Delete ${getCollectionName()}`,
styleClass: "deleteCollectionMenuItem",

View File

@@ -1,4 +1,4 @@
import { DirectionalHint, Icon, Label, Stack, TooltipHost } from "@fluentui/react";
import { DirectionalHint, Icon, IconButton, Label, Stack, TooltipHost } from "@fluentui/react";
import * as React from "react";
import { NormalizedEventKey } from "../../../Common/Constants";
import { accordionStackTokens } from "../Settings/SettingsRenderUtils";
@@ -9,6 +9,9 @@ export interface CollapsibleSectionProps {
onExpand?: () => void;
children: JSX.Element;
tooltipContent?: string | JSX.Element | JSX.Element[];
showDelete?: boolean;
onDelete?: () => void;
disabled?: boolean;
}
export interface CollapsibleSectionState {
@@ -69,6 +72,20 @@ export class CollapsibleSectionComponent extends React.Component<CollapsibleSect
<Icon iconName="Info" className="panelInfoIcon" tabIndex={0} />
</TooltipHost>
)}
{this.props.showDelete && (
<Stack.Item style={{ marginLeft: "auto" }}>
<IconButton
disabled={this.props.disabled}
id={`delete-${this.props.title.split(" ").join("-")}`}
iconProps={{ iconName: "Delete" }}
style={{ height: 27, marginRight: "20px" }}
onClick={(event) => {
event.stopPropagation();
this.props.onDelete();
}}
/>
</Stack.Item>
)}
</Stack>
{this.state.isExpanded && this.props.children}
</>
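A hedged usage sketch of the new delete affordance; the title and handler are illustrative, and this mirrors how FullTextPoliciesComponent (added later in this change set) consumes the prop.

<CollapsibleSectionComponent
  title="Full text path 1"
  isExpandedByDefault={true}
  showDelete={true}
  disabled={false}
  onDelete={() => removePathAt(0) /* illustrative parent callback */}
>
  <div>Section body</div>
</CollapsibleSectionComponent>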

View File

@@ -0,0 +1,6 @@
import "@testing-library/jest-dom";
describe("AddFullTextPolicyForm", () => {
// TODO: add tests
it.skip("should render correctly", () => {});
});

View File

@@ -0,0 +1,239 @@
import {
DefaultButton,
Dropdown,
IDropdownOption,
IStyleFunctionOrObject,
ITextFieldStyleProps,
ITextFieldStyles,
Label,
Stack,
TextField,
} from "@fluentui/react";
import { FullTextIndex, FullTextPath, FullTextPolicy } from "Contracts/DataModels";
import { CollapsibleSectionComponent } from "Explorer/Controls/CollapsiblePanel/CollapsibleSectionComponent";
import * as React from "react";
export interface FullTextPoliciesComponentProps {
fullTextPolicy: FullTextPolicy;
onFullTextPathChange: (
fullTextPolicy: FullTextPolicy,
fullTextIndexes: FullTextIndex[],
validationPassed: boolean,
) => void;
discardChanges?: boolean;
onChangesDiscarded?: () => void;
}
export interface FullTextPolicyData {
path: string;
language: string;
pathError: string;
}
const labelStyles = {
root: {
fontSize: 12,
},
};
const textFieldStyles: IStyleFunctionOrObject<ITextFieldStyleProps, ITextFieldStyles> = {
fieldGroup: {
height: 27,
},
field: {
fontSize: 12,
padding: "0 8px",
},
};
const dropdownStyles = {
title: {
height: 27,
lineHeight: "24px",
fontSize: 12,
},
dropdown: {
height: 27,
lineHeight: "24px",
},
dropdownItem: {
fontSize: 12,
},
};
export const FullTextPoliciesComponent: React.FunctionComponent<FullTextPoliciesComponentProps> = ({
fullTextPolicy,
onFullTextPathChange,
discardChanges,
onChangesDiscarded,
}): JSX.Element => {
const getFullTextPathError = (path: string, index?: number): string => {
let error = "";
if (!path) {
error = "Full text path should not be empty";
}
if (
index >= 0 &&
fullTextPathData?.find(
(fullTextPath: FullTextPolicyData, dataIndex: number) => dataIndex !== index && fullTextPath.path === path,
)
) {
error = "Full text path is already defined";
}
return error;
};
const initializeData = (fullTextPolicy: FullTextPolicy): FullTextPolicyData[] => {
if (!fullTextPolicy) {
fullTextPolicy = { defaultLanguage: getFullTextLanguageOptions()[0].key as never, fullTextPaths: [] };
}
return fullTextPolicy.fullTextPaths.map((fullTextPath: FullTextPath) => ({
...fullTextPath,
pathError: getFullTextPathError(fullTextPath.path),
}));
};
const [fullTextPathData, setFullTextPathData] = React.useState<FullTextPolicyData[]>(initializeData(fullTextPolicy));
const [defaultLanguage, setDefaultLanguage] = React.useState<string>(
fullTextPolicy ? fullTextPolicy.defaultLanguage : (getFullTextLanguageOptions()[0].key as never),
);
React.useEffect(() => {
propagateData();
}, [fullTextPathData, defaultLanguage]);
React.useEffect(() => {
if (discardChanges) {
setFullTextPathData(initializeData(fullTextPolicy));
setDefaultLanguage(fullTextPolicy.defaultLanguage);
onChangesDiscarded();
}
}, [discardChanges]);
const propagateData = () => {
const newFullTextPolicy: FullTextPolicy = {
defaultLanguage: defaultLanguage,
fullTextPaths: fullTextPathData.map((policy: FullTextPolicyData) => ({
path: policy.path,
language: policy.language,
})),
};
const fullTextIndexes: FullTextIndex[] = fullTextPathData.map((policy) => ({
path: policy.path,
}));
const validationPassed = fullTextPathData.every((policy: FullTextPolicyData) => policy.pathError === "");
onFullTextPathChange(newFullTextPolicy, fullTextIndexes, validationPassed);
};
const onFullTextPathValueChange = (index: number, event: React.ChangeEvent<HTMLInputElement>) => {
const value = event.target.value.trim();
const fullTextPaths = [...fullTextPathData];
if (!fullTextPaths[index]?.path && !value.startsWith("/")) {
fullTextPaths[index].path = "/" + value;
} else {
fullTextPaths[index].path = value;
}
fullTextPaths[index].pathError = getFullTextPathError(value, index);
setFullTextPathData(fullTextPaths);
};
const onFullTextPathPolicyChange = (index: number, option: IDropdownOption): void => {
const policies = [...fullTextPathData];
policies[index].language = option.key as never;
setFullTextPathData(policies);
};
const onAdd = () => {
setFullTextPathData([
...fullTextPathData,
{
path: "",
language: defaultLanguage,
pathError: getFullTextPathError(""),
},
]);
};
const onDelete = (index: number) => {
const policies = fullTextPathData.filter((_uniqueKey, j) => index !== j);
setFullTextPathData(policies);
};
return (
<Stack tokens={{ childrenGap: 4 }}>
<Stack style={{ marginBottom: 10 }}>
<Label styles={labelStyles}>Default language</Label>
<Dropdown
required={true}
styles={dropdownStyles}
options={getFullTextLanguageOptions()}
selectedKey={defaultLanguage}
onChange={(_event: React.FormEvent<HTMLDivElement>, option: IDropdownOption) =>
setDefaultLanguage(option.key as never)
}
></Dropdown>
</Stack>
{fullTextPathData &&
fullTextPathData.length > 0 &&
fullTextPathData.map((fullTextPolicy: FullTextPolicyData, index: number) => (
<CollapsibleSectionComponent
key={index}
isExpandedByDefault={true}
title={`Full text path ${index + 1}`}
showDelete={true}
onDelete={() => onDelete(index)}
>
<Stack horizontal tokens={{ childrenGap: 4 }}>
<Stack
styles={{
root: {
margin: "0 0 6px 20px !important",
paddingLeft: 20,
width: "80%",
borderLeft: "1px solid",
},
}}
>
<Stack>
<Label styles={labelStyles}>Path</Label>
<TextField
id={`full-text-policy-path-${index + 1}`}
required={true}
placeholder="/fullTextPath1"
styles={textFieldStyles}
onChange={(event: React.ChangeEvent<HTMLInputElement>) => onFullTextPathValueChange(index, event)}
value={fullTextPolicy.path || ""}
errorMessage={fullTextPolicy.pathError}
/>
</Stack>
<Stack>
<Label styles={labelStyles}>Language</Label>
<Dropdown
required={true}
styles={dropdownStyles}
options={getFullTextLanguageOptions()}
selectedKey={fullTextPolicy.language}
onChange={(_event: React.FormEvent<HTMLDivElement>, option: IDropdownOption) =>
onFullTextPathPolicyChange(index, option)
}
></Dropdown>
</Stack>
</Stack>
</Stack>
</CollapsibleSectionComponent>
))}
<DefaultButton id={`add-full-text-path`} styles={{ root: { maxWidth: 170, fontSize: 12 } }} onClick={onAdd}>
Add full text path
</DefaultButton>
</Stack>
);
};
export const getFullTextLanguageOptions = (): IDropdownOption[] => {
return [
{
key: "en-US",
text: "English (US)",
},
];
};
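A hedged usage sketch for the new component; the parent callback bodies are illustrative, while the prop contract matches FullTextPoliciesComponentProps above.

<FullTextPoliciesComponent
  fullTextPolicy={{
    defaultLanguage: "en-US",
    fullTextPaths: [{ path: "/description", language: "en-US" }],
  }}
  onFullTextPathChange={(fullTextPolicy, fullTextIndexes, validationPassed) => {
    // Illustrative: stash the edited policy and derived indexes in parent state
    // and gate the Save button on validationPassed.
  }}
  discardChanges={false}
  onChangesDiscarded={() => {
    // Illustrative: clear the parent's "discard pending" flag.
  }}
/>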

View File

@@ -0,0 +1,314 @@
// This component renders a text input backed by a dropdown list of options.
// Clicking the input (or its arrow button) opens the list; selecting an option
// writes it back into the input field.
import { getTheme } from "@fluentui/react";
import {
Button,
Divider,
Input,
Link,
makeStyles,
Popover,
PopoverProps,
PopoverSurface,
PositioningImperativeRef,
} from "@fluentui/react-components";
import { ArrowDownRegular, DismissRegular } from "@fluentui/react-icons";
import { NormalizedEventKey } from "Common/Constants";
import { tokens } from "Explorer/Theme/ThemeUtil";
import React, { FC, useEffect, useRef } from "react";
const useStyles = makeStyles({
container: {
padding: 0,
},
input: {
flexGrow: 1,
paddingRight: 0,
outline: "none",
"& input:focus": {
outline: "none", // Undo body :focus dashed outline
},
},
inputButton: {
border: 0,
},
dropdownHeader: {
width: "100%",
fontSize: tokens.fontSizeBase300,
fontWeight: 600,
padding: `${tokens.spacingVerticalM} 0 0 ${tokens.spacingVerticalM}`,
},
dropdownStack: {
display: "flex",
flexDirection: "column",
gap: tokens.spacingVerticalS,
marginTop: tokens.spacingVerticalS,
marginBottom: "1px",
},
dropdownOption: {
fontSize: tokens.fontSizeBase300,
fontWeight: 400,
justifyContent: "left",
padding: `${tokens.spacingHorizontalXS} ${tokens.spacingHorizontalS} ${tokens.spacingHorizontalXS} ${tokens.spacingHorizontalL}`,
overflow: "hidden",
whiteSpace: "nowrap",
textOverflow: "ellipsis",
border: 0,
":hover": {
outline: `1px dashed ${tokens.colorNeutralForeground1Hover}`,
backgroundColor: tokens.colorNeutralBackground2Hover,
color: tokens.colorNeutralForeground1,
},
},
bottomSection: {
fontSize: tokens.fontSizeBase300,
fontWeight: 400,
padding: `${tokens.spacingHorizontalXS} ${tokens.spacingHorizontalS} ${tokens.spacingHorizontalXS} ${tokens.spacingHorizontalL}`,
overflow: "hidden",
whiteSpace: "nowrap",
textOverflow: "ellipsis",
},
});
export interface InputDatalistDropdownOptionSection {
label: string;
options: string[];
}
export interface InputDataListProps {
dropdownOptions: InputDatalistDropdownOptionSection[];
placeholder?: string;
title?: string;
value: string;
onChange: (value: string) => void;
onKeyDown?: (e: React.KeyboardEvent<HTMLInputElement>) => void;
autofocus?: boolean; // true: acquire focus on first render
bottomLink?: {
text: string;
url: string;
};
}
export const InputDataList: FC<InputDataListProps> = ({
dropdownOptions,
placeholder,
title,
value,
onChange,
onKeyDown,
autofocus,
bottomLink,
}) => {
const styles = useStyles();
const [showDropdown, setShowDropdown] = React.useState(false);
const inputRef = useRef<HTMLInputElement>(null);
const positioningRef = React.useRef<PositioningImperativeRef>(null);
const [isInputFocused, setIsInputFocused] = React.useState(autofocus);
const [autofocusFirstDropdownItem, setAutofocusFirstDropdownItem] = React.useState(false);
const theme = getTheme();
const itemRefs = useRef([]);
useEffect(() => {
if (inputRef.current) {
positioningRef.current?.setTarget(inputRef.current);
}
}, [inputRef, positioningRef]);
useEffect(() => {
if (isInputFocused) {
inputRef.current?.focus();
}
}, [isInputFocused]);
useEffect(() => {
if (autofocusFirstDropdownItem && showDropdown) {
// Autofocus on first item if input isn't focused
itemRefs.current[0]?.focus();
setAutofocusFirstDropdownItem(false);
}
}, [autofocusFirstDropdownItem, showDropdown]);
const handleOpenChange: PopoverProps["onOpenChange"] = (e, data) => {
if (isInputFocused && !data.open) {
// Don't close if input is focused and we're opening the dropdown (which will steal the focus)
return;
}
setShowDropdown(data.open || false);
if (data.open) {
setIsInputFocused(true);
}
};
const handleInputKeyDown = (e: React.KeyboardEvent<HTMLInputElement>) => {
if (e.key === NormalizedEventKey.Escape) {
setShowDropdown(false);
} else if (e.key === NormalizedEventKey.DownArrow) {
setShowDropdown(true);
setAutofocusFirstDropdownItem(true);
}
onKeyDown(e);
};
const handleDownDropdownItemKeyDown = (
e: React.KeyboardEvent<HTMLButtonElement | HTMLAnchorElement>,
index: number,
) => {
if (e.key === NormalizedEventKey.Enter) {
e.currentTarget.click();
} else if (e.key === NormalizedEventKey.Escape) {
setShowDropdown(false);
inputRef.current?.focus();
} else if (e.key === NormalizedEventKey.DownArrow) {
if (index + 1 < itemRefs.current.length) {
itemRefs.current[index + 1].focus();
} else {
setIsInputFocused(true);
}
} else if (e.key === NormalizedEventKey.UpArrow) {
if (index - 1 >= 0) {
itemRefs.current[index - 1].focus();
} else {
// First item reached; move focus back to the input
setIsInputFocused(true);
}
}
};
// Flatten dropdownOptions to better manage refs and focus
let flatIndex = 0;
const indexMap = new Map<string, number>();
for (let sectionIndex = 0; sectionIndex < dropdownOptions.length; sectionIndex++) {
const section = dropdownOptions[sectionIndex];
for (let optionIndex = 0; optionIndex < section.options.length; optionIndex++) {
indexMap.set(`${sectionIndex}-${optionIndex}`, flatIndex);
flatIndex++;
}
}
return (
<>
<Input
id="filterInput"
ref={inputRef}
type="text"
size="small"
autoComplete="off"
className={`filterInput ${styles.input}`}
title={title}
placeholder={placeholder}
value={value}
autoFocus
onKeyDown={handleInputKeyDown}
onChange={(e) => {
const newValue = e.target.value;
// Don't show dropdown if there is already a value in the input field (when user is typing)
setShowDropdown(!(newValue.length > 0));
onChange(newValue);
}}
onClick={(e) => {
e.stopPropagation();
}}
onFocus={() => {
// Don't show dropdown if there is already a value in the input field
// or isInputFocused is undefined which means component is mounting
setShowDropdown(!(value.length > 0) && isInputFocused !== undefined);
setIsInputFocused(true);
}}
onBlur={() => {
setIsInputFocused(false);
}}
contentAfter={
value.length > 0 ? (
<Button
aria-label="Clear filter"
className={styles.inputButton}
size="small"
icon={<DismissRegular />}
onClick={() => {
onChange("");
setIsInputFocused(true);
}}
/>
) : (
<Button
aria-label="Open dropdown"
className={styles.inputButton}
size="small"
icon={<ArrowDownRegular />}
onClick={() => {
setShowDropdown(true);
setAutofocusFirstDropdownItem(true);
}}
/>
)
}
/>
<Popover
inline
unstable_disableAutoFocus
// trapFocus
open={showDropdown}
onOpenChange={handleOpenChange}
positioning={{ positioningRef, position: "below", align: "start", offset: 4 }}
>
<PopoverSurface className={styles.container}>
{dropdownOptions.map((section, sectionIndex) => (
<div key={section.label}>
<div className={styles.dropdownHeader} style={{ color: theme.palette.themePrimary }}>
{section.label}
</div>
<div className={styles.dropdownStack}>
{section.options.map((option, index) => (
<Button
key={option}
ref={(el) => (itemRefs.current[indexMap.get(`${sectionIndex}-${index}`)] = el)}
appearance="transparent"
shape="square"
className={styles.dropdownOption}
onClick={() => {
onChange(option);
setShowDropdown(false);
setIsInputFocused(true);
}}
onBlur={() =>
!bottomLink &&
sectionIndex === dropdownOptions.length - 1 &&
index === section.options.length - 1 &&
setShowDropdown(false)
}
onKeyDown={(e: React.KeyboardEvent<HTMLButtonElement>) =>
handleDownDropdownItemKeyDown(e, indexMap.get(`${sectionIndex}-${index}`))
}
>
{option}
</Button>
))}
</div>
</div>
))}
{bottomLink && (
<>
<Divider />
<div className={styles.bottomSection}>
<Link
ref={(el) => (itemRefs.current[flatIndex] = el)}
href={bottomLink.url}
target="_blank"
onBlur={() => setShowDropdown(false)}
onKeyDown={(e: React.KeyboardEvent<HTMLAnchorElement>) => handleDownDropdownItemKeyDown(e, flatIndex)}
>
{bottomLink.text}
</Link>
</div>
</>
)}
</PopoverSurface>
</Popover>
</>
);
};
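A hedged usage sketch for the dropdown-backed filter input; filterText, setFilterText, and applyFilter are assumed parent state and handlers, and the option strings and URL are illustrative.

<InputDataList
  dropdownOptions={[
    { label: "Recent filters", options: ['WHERE c.id = "1"', "ORDER BY c._ts DESC"] },
    { label: "Samples", options: ["WHERE IS_DEFINED(c.name)"] },
  ]}
  placeholder="Type a query predicate or pick one from the list"
  title="Filter"
  value={filterText}
  onChange={(newValue) => setFilterText(newValue)}
  onKeyDown={(e) => {
    if (e.key === "Enter") {
      applyFilter(filterText); // illustrative parent handler
    }
  }}
  autofocus={true}
  bottomLink={{ text: "Query syntax reference", url: "https://example.com/query-docs" }}
/>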

View File

@@ -1,5 +1,7 @@
import { AuthType } from "AuthType";
import { shallow } from "enzyme";
import ko from "knockout";
import { Features } from "Platform/Hosted/extractFeatures";
import React from "react";
import { updateCollection } from "../../../Common/dataAccess/updateCollection";
import { updateOffer } from "../../../Common/dataAccess/updateOffer";
@@ -247,4 +249,42 @@ describe("SettingsComponent", () => {
expect(conflictResolutionPolicy.mode).toEqual(DataModels.ConflictResolutionMode.Custom);
expect(conflictResolutionPolicy.conflictResolutionProcedure).toEqual(expectSprocPath);
});
it("should save throughput bucket changes when Save button is clicked", async () => {
updateUserContext({
apiType: "SQL",
features: { enableThroughputBuckets: true } as Features,
authType: AuthType.AAD,
});
const wrapper = shallow(<SettingsComponent {...baseProps} />);
const settingsComponentInstance = wrapper.instance() as SettingsComponent;
const isEnabled = settingsComponentInstance["throughputBucketsEnabled"];
expect(isEnabled).toBe(true);
wrapper.setState({
isThroughputBucketsSaveable: true,
throughputBuckets: [
{ id: 1, maxThroughputPercentage: 70 },
{ id: 2, maxThroughputPercentage: 60 },
],
});
await settingsComponentInstance.onSaveClick();
expect(updateOffer).toHaveBeenCalledWith({
databaseId: collection.databaseId,
collectionId: collection.id(),
currentOffer: expect.any(Object),
autopilotThroughput: collection.offer().autoscaleMaxThroughput,
manualThroughput: collection.offer().manualThroughput,
throughputBuckets: [
{ id: 1, maxThroughputPercentage: 70 },
{ id: 2, maxThroughputPercentage: 60 },
],
});
expect(wrapper.state("isThroughputBucketsSaveable")).toBe(false);
});
});
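The assertion above encodes the contract sketched below. The bucket fields come straight from this compare; the parameter interface is illustrative only, since the repo's actual updateOffer parameter type is not shown in this hunk.
interface ThroughputBucket {
  id: number; // 1-based group id; group 1 is the Data Explorer query bucket
  maxThroughputPercentage: number; // 1-100, where 100 marks the bucket inactive
}
// Illustrative shape of the object updateOffer is expected to receive.
interface UpdateOfferCallSketch {
  databaseId: string;
  collectionId: string;
  currentOffer: unknown;
  autopilotThroughput?: number; // set when the offer is autoscale
  manualThroughput?: number; // set when the offer uses manual throughput
  throughputBuckets?: ThroughputBucket[];
}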

View File

@@ -4,11 +4,15 @@ import {
ComputedPropertiesComponentProps,
} from "Explorer/Controls/Settings/SettingsSubComponents/ComputedPropertiesComponent";
import {
ContainerVectorPolicyComponent,
ContainerVectorPolicyComponentProps,
} from "Explorer/Controls/Settings/SettingsSubComponents/ContainerVectorPolicyComponent";
ContainerPolicyComponent,
ContainerPolicyComponentProps,
} from "Explorer/Controls/Settings/SettingsSubComponents/ContainerPolicyComponent";
import {
ThroughputBucketsComponent,
ThroughputBucketsComponentProps,
} from "Explorer/Controls/Settings/SettingsSubComponents/ThroughputInputComponents/ThroughputBucketsComponent";
import { useDatabases } from "Explorer/useDatabases";
import { isVectorSearchEnabled } from "Utils/CapabilityUtils";
import { isFullTextSearchEnabled, isVectorSearchEnabled } from "Utils/CapabilityUtils";
import { isRunningOnPublicCloud } from "Utils/CloudUtils";
import * as React from "react";
import DiscardIcon from "../../../../images/discard.svg";
@@ -86,6 +90,8 @@ export interface SettingsComponentState {
wasAutopilotOriginallySet: boolean;
isScaleSaveable: boolean;
isScaleDiscardable: boolean;
throughputBuckets: DataModels.ThroughputBucket[];
throughputBucketsBaseline: DataModels.ThroughputBucket[];
throughputError: string;
timeToLive: TtlType;
@@ -104,6 +110,14 @@ export interface SettingsComponentState {
changeFeedPolicyBaseline: ChangeFeedPolicyState;
isSubSettingsSaveable: boolean;
isSubSettingsDiscardable: boolean;
isThroughputBucketsSaveable: boolean;
vectorEmbeddingPolicy: DataModels.VectorEmbeddingPolicy;
vectorEmbeddingPolicyBaseline: DataModels.VectorEmbeddingPolicy;
fullTextPolicy: DataModels.FullTextPolicy;
fullTextPolicyBaseline: DataModels.FullTextPolicy;
shouldDiscardContainerPolicies: boolean;
isContainerPolicyDirty: boolean;
indexingPolicyContent: DataModels.IndexingPolicy;
indexingPolicyContentBaseline: DataModels.IndexingPolicy;
@@ -149,7 +163,9 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
private shouldShowIndexingPolicyEditor: boolean;
private shouldShowPartitionKeyEditor: boolean;
private isVectorSearchEnabled: boolean;
private isFullTextSearchEnabled: boolean;
private totalThroughputUsed: number;
private throughputBucketsEnabled: boolean;
public mongoDBCollectionResource: MongoDBCollectionResource;
constructor(props: SettingsComponentProps) {
@@ -164,8 +180,13 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
this.shouldShowIndexingPolicyEditor = userContext.apiType !== "Cassandra" && userContext.apiType !== "Mongo";
this.shouldShowPartitionKeyEditor = userContext.apiType === "SQL" && isRunningOnPublicCloud();
this.isVectorSearchEnabled = isVectorSearchEnabled() && !hasDatabaseSharedThroughput(this.collection);
this.isFullTextSearchEnabled = isFullTextSearchEnabled() && !hasDatabaseSharedThroughput(this.collection);
this.changeFeedPolicyVisible = userContext.features.enableChangeFeedPolicy;
this.throughputBucketsEnabled =
userContext.apiType === "SQL" &&
userContext.features.enableThroughputBuckets &&
userContext.authType === AuthType.AAD;
// A Mongo container with a system partition key is still treated as "Fixed"
this.isFixedContainer =
@@ -184,6 +205,8 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
wasAutopilotOriginallySet: false,
isScaleSaveable: false,
isScaleDiscardable: false,
throughputBuckets: undefined,
throughputBucketsBaseline: undefined,
throughputError: undefined,
timeToLive: undefined,
@@ -202,6 +225,14 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
changeFeedPolicyBaseline: undefined,
isSubSettingsSaveable: false,
isSubSettingsDiscardable: false,
isThroughputBucketsSaveable: false,
vectorEmbeddingPolicy: undefined,
vectorEmbeddingPolicyBaseline: undefined,
fullTextPolicy: undefined,
fullTextPolicyBaseline: undefined,
shouldDiscardContainerPolicies: false,
isContainerPolicyDirty: false,
indexingPolicyContent: undefined,
indexingPolicyContentBaseline: undefined,
@@ -307,10 +338,12 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
return (
this.state.isScaleSaveable ||
this.state.isSubSettingsSaveable ||
this.state.isContainerPolicyDirty ||
this.state.isIndexingPolicyDirty ||
this.state.isConflictResolutionDirty ||
this.state.isComputedPropertiesDirty ||
(!!this.state.currentMongoIndexes && this.state.isMongoIndexingPolicySaveable)
(!!this.state.currentMongoIndexes && this.state.isMongoIndexingPolicySaveable) ||
this.state.isThroughputBucketsSaveable
);
};
@@ -318,10 +351,12 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
return (
this.state.isScaleDiscardable ||
this.state.isSubSettingsDiscardable ||
this.state.isContainerPolicyDirty ||
this.state.isIndexingPolicyDirty ||
this.state.isConflictResolutionDirty ||
this.state.isComputedPropertiesDirty ||
(!!this.state.currentMongoIndexes && this.state.isMongoIndexingPolicyDiscardable)
(!!this.state.currentMongoIndexes && this.state.isMongoIndexingPolicyDiscardable) ||
this.state.isThroughputBucketsSaveable
);
};
@@ -401,10 +436,14 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
this.setState({
throughput: this.state.throughputBaseline,
throughputBuckets: this.state.throughputBucketsBaseline,
throughputBucketsBaseline: this.state.throughputBucketsBaseline,
timeToLive: this.state.timeToLiveBaseline,
timeToLiveSeconds: this.state.timeToLiveSecondsBaseline,
displayedTtlSeconds: this.state.displayedTtlSecondsBaseline,
geospatialConfigType: this.state.geospatialConfigTypeBaseline,
vectorEmbeddingPolicy: this.state.vectorEmbeddingPolicyBaseline,
fullTextPolicy: this.state.fullTextPolicyBaseline,
indexingPolicyContent: this.state.indexingPolicyContentBaseline,
indexesToAdd: [],
indexesToDrop: [],
@@ -416,11 +455,14 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
changeFeedPolicy: this.state.changeFeedPolicyBaseline,
autoPilotThroughput: this.state.autoPilotThroughputBaseline,
isAutoPilotSelected: this.state.wasAutopilotOriginallySet,
shouldDiscardContainerPolicies: true,
shouldDiscardIndexingPolicy: true,
isScaleSaveable: false,
isScaleDiscardable: false,
isSubSettingsSaveable: false,
isThroughputBucketsSaveable: false,
isSubSettingsDiscardable: false,
isContainerPolicyDirty: false,
isIndexingPolicyDirty: false,
isMongoIndexingPolicySaveable: false,
isMongoIndexingPolicyDiscardable: false,
@@ -448,9 +490,21 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
private onScaleDiscardableChange = (isScaleDiscardable: boolean): void =>
this.setState({ isScaleDiscardable: isScaleDiscardable });
private onVectorEmbeddingPolicyChange = (newVectorEmbeddingPolicy: DataModels.VectorEmbeddingPolicy): void =>
this.setState({ vectorEmbeddingPolicy: newVectorEmbeddingPolicy });
private onFullTextPolicyChange = (newFullTextPolicy: DataModels.FullTextPolicy): void =>
this.setState({ fullTextPolicy: newFullTextPolicy });
private onIndexingPolicyContentChange = (newIndexingPolicy: DataModels.IndexingPolicy): void =>
this.setState({ indexingPolicyContent: newIndexingPolicy });
private onThroughputBucketsSaveableChange = (isSaveable: boolean): void => {
this.setState({ isThroughputBucketsSaveable: isSaveable });
};
private resetShouldDiscardContainerPolicies = (): void => this.setState({ shouldDiscardContainerPolicies: false });
private resetShouldDiscardIndexingPolicy = (): void => this.setState({ shouldDiscardIndexingPolicy: false });
private logIndexingPolicySuccessMessage = (): void => {
@@ -538,6 +592,12 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
private onSubSettingsDiscardableChange = (isSubSettingsDiscardable: boolean): void =>
this.setState({ isSubSettingsDiscardable: isSubSettingsDiscardable });
private onVectorEmbeddingPolicyDirtyChange = (isVectorEmbeddingPolicyDirty: boolean): void =>
this.setState({ isContainerPolicyDirty: isVectorEmbeddingPolicyDirty });
private onFullTextPolicyDirtyChange = (isFullTextPolicyDirty: boolean): void =>
this.setState({ isContainerPolicyDirty: isFullTextPolicyDirty });
private onIndexingPolicyDirtyChange = (isIndexingPolicyDirty: boolean): void =>
this.setState({ isIndexingPolicyDirty: isIndexingPolicyDirty });
@@ -691,6 +751,10 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
const changeFeedPolicy = this.collection.rawDataModel?.changeFeedPolicy
? ChangeFeedPolicyState.On
: ChangeFeedPolicyState.Off;
const vectorEmbeddingPolicy: DataModels.VectorEmbeddingPolicy =
this.collection.vectorEmbeddingPolicy && this.collection.vectorEmbeddingPolicy();
const fullTextPolicy: DataModels.FullTextPolicy =
this.collection.fullTextPolicy && this.collection.fullTextPolicy();
const indexingPolicyContent = this.collection.indexingPolicy();
const conflictResolutionPolicy: DataModels.ConflictResolutionPolicy =
this.collection.conflictResolutionPolicy && this.collection.conflictResolutionPolicy();
@@ -709,9 +773,13 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
] as DataModels.ComputedProperties;
}
const throughputBuckets = this.offer?.throughputBuckets;
return {
throughput: offerThroughput,
throughputBaseline: offerThroughput,
throughputBuckets,
throughputBucketsBaseline: throughputBuckets,
changeFeedPolicy: changeFeedPolicy,
changeFeedPolicyBaseline: changeFeedPolicy,
timeToLive: timeToLive,
@@ -724,6 +792,10 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
analyticalStorageTtlSelectionBaseline: analyticalStorageTtlSelection,
analyticalStorageTtlSeconds: analyticalStorageTtlSeconds,
analyticalStorageTtlSecondsBaseline: analyticalStorageTtlSeconds,
vectorEmbeddingPolicy: vectorEmbeddingPolicy,
vectorEmbeddingPolicyBaseline: vectorEmbeddingPolicy,
fullTextPolicy: fullTextPolicy,
fullTextPolicyBaseline: fullTextPolicy,
indexingPolicyContent: indexingPolicyContent,
indexingPolicyContentBaseline: indexingPolicyContent,
conflictResolutionPolicyMode: conflictResolutionPolicyMode,
@@ -795,6 +867,10 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
this.setState({ throughput: newThroughput, throughputError });
};
private onThroughputBucketChange = (throughputBuckets: DataModels.ThroughputBucket[]): void => {
this.setState({ throughputBuckets });
};
private onAutoPilotSelected = (isAutoPilotSelected: boolean): void =>
this.setState({ isAutoPilotSelected: isAutoPilotSelected });
@@ -854,6 +930,7 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
if (
this.state.isSubSettingsSaveable ||
this.state.isContainerPolicyDirty ||
this.state.isIndexingPolicyDirty ||
this.state.isConflictResolutionDirty ||
this.state.isComputedPropertiesDirty
@@ -875,6 +952,10 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
const wasIndexingPolicyModified = this.state.isIndexingPolicyDirty;
newCollection.defaultTtl = defaultTtl;
newCollection.vectorEmbeddingPolicy = this.state.vectorEmbeddingPolicy;
newCollection.fullTextPolicy = this.state.fullTextPolicy;
newCollection.indexingPolicy = this.state.indexingPolicyContent;
newCollection.changeFeedPolicy =
@@ -913,6 +994,8 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
this.collection.changeFeedPolicy(updatedCollection.changeFeedPolicy);
this.collection.geospatialConfig(updatedCollection.geospatialConfig);
this.collection.computedProperties(updatedCollection.computedProperties);
this.collection.vectorEmbeddingPolicy(updatedCollection.vectorEmbeddingPolicy);
this.collection.fullTextPolicy(updatedCollection.fullTextPolicy);
if (wasIndexingPolicyModified) {
await this.refreshIndexTransformationProgress();
@@ -921,6 +1004,7 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
this.setState({
isSubSettingsSaveable: false,
isSubSettingsDiscardable: false,
isContainerPolicyDirty: false,
isIndexingPolicyDirty: false,
isConflictResolutionDirty: false,
isComputedPropertiesDirty: false,
@@ -977,6 +1061,24 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
}
}
if (this.throughputBucketsEnabled && this.state.isThroughputBucketsSaveable) {
const updatedOffer: DataModels.Offer = await updateOffer({
databaseId: this.collection.databaseId,
collectionId: this.collection.id(),
currentOffer: this.collection.offer(),
autopilotThroughput: this.collection.offer().autoscaleMaxThroughput
? this.collection.offer().autoscaleMaxThroughput
: undefined,
manualThroughput: this.collection.offer().manualThroughput
? this.collection.offer().manualThroughput
: undefined,
throughputBuckets: this.state.throughputBuckets,
});
this.collection.offer(updatedOffer);
this.offer = updatedOffer;
this.setState({ isThroughputBucketsSaveable: false });
}
if (this.state.isScaleSaveable) {
const updateOfferParams: DataModels.UpdateOfferParams = {
databaseId: this.collection.databaseId,
@@ -1091,6 +1193,21 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
onSubSettingsDiscardableChange: this.onSubSettingsDiscardableChange,
};
const containerPolicyComponentProps: ContainerPolicyComponentProps = {
vectorEmbeddingPolicy: this.state.vectorEmbeddingPolicy,
vectorEmbeddingPolicyBaseline: this.state.vectorEmbeddingPolicyBaseline,
onVectorEmbeddingPolicyChange: this.onVectorEmbeddingPolicyChange,
onVectorEmbeddingPolicyDirtyChange: this.onVectorEmbeddingPolicyDirtyChange,
isVectorSearchEnabled: this.isVectorSearchEnabled,
fullTextPolicy: this.state.fullTextPolicy,
fullTextPolicyBaseline: this.state.fullTextPolicyBaseline,
onFullTextPolicyChange: this.onFullTextPolicyChange,
onFullTextPolicyDirtyChange: this.onFullTextPolicyDirtyChange,
isFullTextSearchEnabled: this.isFullTextSearchEnabled,
shouldDiscardContainerPolicies: this.state.shouldDiscardContainerPolicies,
resetShouldDiscardContainerPolicyChange: this.resetShouldDiscardContainerPolicies,
};
const indexingPolicyComponentProps: IndexingPolicyComponentProps = {
shouldDiscardIndexingPolicy: this.state.shouldDiscardIndexingPolicy,
resetShouldDiscardIndexingPolicy: this.resetShouldDiscardIndexingPolicy,
@@ -1142,16 +1259,19 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
onConflictResolutionDirtyChange: this.onConflictResolutionDirtyChange,
};
const throughputBucketsComponentProps: ThroughputBucketsComponentProps = {
currentBuckets: this.state.throughputBuckets,
throughputBucketsBaseline: this.state.throughputBucketsBaseline,
onBucketsChange: this.onThroughputBucketChange,
onSaveableChange: this.onThroughputBucketsSaveableChange,
};
const partitionKeyComponentProps: PartitionKeyComponentProps = {
database: useDatabases.getState().findDatabaseWithId(this.collection.databaseId),
collection: this.collection,
explorer: this.props.settingsTab.getContainer(),
};
const containerVectorPolicyProps: ContainerVectorPolicyComponentProps = {
vectorEmbeddingPolicy: this.collection.rawDataModel?.vectorEmbeddingPolicy,
};
const tabs: SettingsV2TabInfo[] = [];
if (!hasDatabaseSharedThroughput(this.collection) && this.offer) {
tabs.push({
@@ -1165,10 +1285,10 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
content: <SubSettingsComponent {...subSettingsComponentProps} />,
});
if (this.isVectorSearchEnabled) {
if (this.isVectorSearchEnabled || this.isFullTextSearchEnabled) {
tabs.push({
tab: SettingsV2TabTypes.ContainerVectorPolicyTab,
content: <ContainerVectorPolicyComponent {...containerVectorPolicyProps} />,
content: <ContainerPolicyComponent {...containerPolicyComponentProps} />,
});
}
@@ -1208,6 +1328,13 @@ export class SettingsComponent extends React.Component<SettingsComponentProps, S
});
}
if (this.throughputBucketsEnabled) {
tabs.push({
tab: SettingsV2TabTypes.ThroughputBucketsTab,
content: <ThroughputBucketsComponent {...throughputBucketsComponentProps} />,
});
}
const pivotProps: IPivotProps = {
onLinkClick: this.onPivotChange,
selectedKey: SettingsV2TabTypes[this.state.selectedTab],
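A condensed sketch of how the throughput-bucket hunks in this file fit together; the gate is copied from the constructor above, and the numbered comments summarize the surrounding hunks rather than adding behavior.
const throughputBucketsEnabled =
  userContext.apiType === "SQL" &&
  userContext.features.enableThroughputBuckets &&
  userContext.authType === AuthType.AAD;
// When the gate is true:
//  1. a "Throughput Buckets" tab is pushed onto the settings pivot;
//  2. edits in that tab set isThroughputBucketsSaveable, which now feeds both
//     isSaveable() and isDiscardable(), enabling Save/Discard;
//  3. onSaveClick calls updateOffer with the current offer's throughput values
//     plus state.throughputBuckets, stores the returned offer, and clears the flag;
//  4. discarding restores throughputBuckets from throughputBucketsBaseline.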

View File

@@ -0,0 +1,6 @@
import "@testing-library/jest-dom";
describe("ContainerPolicyComponent", () => {
// TODO: add tests
it.skip("should render correctly", () => {});
});
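A minimal render-test sketch that could replace the skipped placeholder, assuming the props defined in ContainerPolicyComponentProps (shown in the next file) and the jest/@testing-library setup already used in this repo; it only checks that the Vector Policy pivot header appears when vector search is enabled.
import "@testing-library/jest-dom";
import { render, screen } from "@testing-library/react";
import React from "react";
import { ContainerPolicyComponent } from "./ContainerPolicyComponent";
it("renders the Vector Policy pivot when vector search is enabled", () => {
  render(
    <ContainerPolicyComponent
      vectorEmbeddingPolicy={{ vectorEmbeddings: [] }}
      vectorEmbeddingPolicyBaseline={{ vectorEmbeddings: [] }}
      onVectorEmbeddingPolicyChange={jest.fn()}
      onVectorEmbeddingPolicyDirtyChange={jest.fn()}
      isVectorSearchEnabled={true}
      fullTextPolicy={undefined}
      fullTextPolicyBaseline={undefined}
      onFullTextPolicyChange={jest.fn()}
      onFullTextPolicyDirtyChange={jest.fn()}
      isFullTextSearchEnabled={false}
      shouldDiscardContainerPolicies={false}
      resetShouldDiscardContainerPolicyChange={jest.fn()}
    />,
  );
  expect(screen.getByText("Vector Policy")).toBeInTheDocument();
});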

View File

@@ -0,0 +1,163 @@
import { DefaultButton, Pivot, PivotItem, Stack } from "@fluentui/react";
import { FullTextPolicy, VectorEmbedding, VectorEmbeddingPolicy } from "Contracts/DataModels";
import {
FullTextPoliciesComponent,
getFullTextLanguageOptions,
} from "Explorer/Controls/FullTextSeach/FullTextPoliciesComponent";
import { titleAndInputStackProps } from "Explorer/Controls/Settings/SettingsRenderUtils";
import { ContainerPolicyTabTypes, isDirty } from "Explorer/Controls/Settings/SettingsUtils";
import { VectorEmbeddingPoliciesComponent } from "Explorer/Controls/VectorSearch/VectorEmbeddingPoliciesComponent";
import React from "react";
export interface ContainerPolicyComponentProps {
vectorEmbeddingPolicy: VectorEmbeddingPolicy;
vectorEmbeddingPolicyBaseline: VectorEmbeddingPolicy;
onVectorEmbeddingPolicyChange: (newVectorEmbeddingPolicy: VectorEmbeddingPolicy) => void;
onVectorEmbeddingPolicyDirtyChange: (isVectorEmbeddingPolicyDirty: boolean) => void;
isVectorSearchEnabled: boolean;
fullTextPolicy: FullTextPolicy;
fullTextPolicyBaseline: FullTextPolicy;
onFullTextPolicyChange: (newFullTextPolicy: FullTextPolicy) => void;
onFullTextPolicyDirtyChange: (isFullTextPolicyDirty: boolean) => void;
isFullTextSearchEnabled: boolean;
shouldDiscardContainerPolicies: boolean;
resetShouldDiscardContainerPolicyChange: () => void;
}
export const ContainerPolicyComponent: React.FC<ContainerPolicyComponentProps> = ({
vectorEmbeddingPolicy,
vectorEmbeddingPolicyBaseline,
onVectorEmbeddingPolicyChange,
onVectorEmbeddingPolicyDirtyChange,
isVectorSearchEnabled,
fullTextPolicy,
fullTextPolicyBaseline,
onFullTextPolicyChange,
onFullTextPolicyDirtyChange,
isFullTextSearchEnabled,
shouldDiscardContainerPolicies,
resetShouldDiscardContainerPolicyChange,
}) => {
const [selectedTab, setSelectedTab] = React.useState<ContainerPolicyTabTypes>(
ContainerPolicyTabTypes.VectorPolicyTab,
);
const [vectorEmbeddings, setVectorEmbeddings] = React.useState<VectorEmbedding[]>();
const [vectorEmbeddingsBaseline, setVectorEmbeddingsBaseline] = React.useState<VectorEmbedding[]>();
const [discardVectorChanges, setDiscardVectorChanges] = React.useState<boolean>(false);
const [fullTextSearchPolicy, setFullTextSearchPolicy] = React.useState<FullTextPolicy>();
const [fullTextSearchPolicyBaseline, setFullTextSearchPolicyBaseline] = React.useState<FullTextPolicy>();
const [discardFullTextChanges, setDiscardFullTextChanges] = React.useState<boolean>(false);
React.useEffect(() => {
setVectorEmbeddings(vectorEmbeddingPolicy?.vectorEmbeddings);
setVectorEmbeddingsBaseline(vectorEmbeddingPolicyBaseline?.vectorEmbeddings);
}, [vectorEmbeddingPolicy]);
React.useEffect(() => {
setFullTextSearchPolicy(fullTextPolicy);
setFullTextSearchPolicyBaseline(fullTextPolicyBaseline);
}, [fullTextPolicy, fullTextPolicyBaseline]);
React.useEffect(() => {
if (shouldDiscardContainerPolicies) {
setVectorEmbeddings(vectorEmbeddingPolicyBaseline?.vectorEmbeddings);
setDiscardVectorChanges(true);
setFullTextSearchPolicy(fullTextPolicyBaseline);
setDiscardFullTextChanges(true);
resetShouldDiscardContainerPolicyChange();
}
});
const checkAndSendVectorEmbeddingPoliciesToSettings = (newVectorEmbeddings: VectorEmbedding[]): void => {
if (isDirty(newVectorEmbeddings, vectorEmbeddingsBaseline)) {
onVectorEmbeddingPolicyDirtyChange(true);
onVectorEmbeddingPolicyChange({ vectorEmbeddings: newVectorEmbeddings });
} else {
resetShouldDiscardContainerPolicyChange();
}
};
const checkAndSendFullTextPolicyToSettings = (newFullTextPolicy: FullTextPolicy): void => {
if (isDirty(newFullTextPolicy, fullTextSearchPolicyBaseline)) {
onFullTextPolicyDirtyChange(true);
onFullTextPolicyChange(newFullTextPolicy);
} else {
resetShouldDiscardContainerPolicyChange();
}
};
const onVectorChangesDiscarded = (): void => {
setDiscardVectorChanges(false);
};
const onFullTextChangesDiscarded = (): void => {
setDiscardFullTextChanges(false);
};
const onPivotChange = (item: PivotItem): void => {
const selectedTab = ContainerPolicyTabTypes[item.props.itemKey as keyof typeof ContainerPolicyTabTypes];
setSelectedTab(selectedTab);
};
return (
<div>
<Pivot onLinkClick={onPivotChange} selectedKey={ContainerPolicyTabTypes[selectedTab]}>
{isVectorSearchEnabled && (
<PivotItem
itemKey={ContainerPolicyTabTypes[ContainerPolicyTabTypes.VectorPolicyTab]}
style={{ marginTop: 20 }}
headerText="Vector Policy"
>
<Stack {...titleAndInputStackProps} styles={{ root: { position: "relative", maxWidth: "400px" } }}>
{vectorEmbeddings && (
<VectorEmbeddingPoliciesComponent
disabled={true}
vectorEmbeddings={vectorEmbeddings}
vectorIndexes={undefined}
onVectorEmbeddingChange={(vectorEmbeddings: VectorEmbedding[]) =>
checkAndSendVectorEmbeddingPoliciesToSettings(vectorEmbeddings)
}
discardChanges={discardVectorChanges}
onChangesDiscarded={onVectorChangesDiscarded}
/>
)}
</Stack>
</PivotItem>
)}
{isFullTextSearchEnabled && (
<PivotItem
itemKey={ContainerPolicyTabTypes[ContainerPolicyTabTypes.FullTextPolicyTab]}
style={{ marginTop: 20 }}
headerText="Full Text Policy"
>
<Stack {...titleAndInputStackProps} styles={{ root: { position: "relative", maxWidth: "400px" } }}>
{fullTextSearchPolicy ? (
<FullTextPoliciesComponent
fullTextPolicy={fullTextSearchPolicy}
onFullTextPathChange={(newFullTextPolicy: FullTextPolicy) =>
checkAndSendFullTextPolicyToSettings(newFullTextPolicy)
}
discardChanges={discardFullTextChanges}
onChangesDiscarded={onFullTextChangesDiscarded}
/>
) : (
<DefaultButton
id={"create-full-text-policy"}
styles={{ root: { fontSize: 12 } }}
onClick={() => {
checkAndSendFullTextPolicyToSettings({
defaultLanguage: getFullTextLanguageOptions()[0].key as never,
fullTextPaths: [],
});
}}
>
Create new full text search policy
</DefaultButton>
)}
</Stack>
</PivotItem>
)}
</Pivot>
</div>
);
};
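The object passed by the "Create new full text search policy" button above is sketched here; the example path entry is illustrative and assumes the Cosmos DB full text policy contract of a default language plus per-path languages.
import { FullTextPolicy } from "Contracts/DataModels";
// Initial policy created by the button: a default language and no paths yet.
const newPolicy: FullTextPolicy = {
  defaultLanguage: "en-US", // first entry returned by getFullTextLanguageOptions()
  fullTextPaths: [], // entries such as { path: "/description", language: "en-US" } are added afterwards in the editor
};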

View File

@@ -1,30 +0,0 @@
import { Stack } from "@fluentui/react";
import { VectorEmbeddingPolicy } from "Contracts/DataModels";
import { EditorReact } from "Explorer/Controls/Editor/EditorReact";
import { titleAndInputStackProps } from "Explorer/Controls/Settings/SettingsRenderUtils";
import React from "react";
export interface ContainerVectorPolicyComponentProps {
vectorEmbeddingPolicy: VectorEmbeddingPolicy;
}
export const ContainerVectorPolicyComponent: React.FC<ContainerVectorPolicyComponentProps> = ({
vectorEmbeddingPolicy,
}) => {
return (
<Stack {...titleAndInputStackProps} styles={{ root: { position: "relative" } }}>
<EditorReact
language={"json"}
content={JSON.stringify(vectorEmbeddingPolicy || {}, null, 4)}
isReadOnly={true}
wordWrap={"on"}
ariaLabel={"Container vector policy"}
lineNumbers={"on"}
scrollBeyondLastLine={false}
className={"settingsV2Editor"}
spinnerClassName={"settingsV2EditorSpinner"}
fontSize={14}
/>
</Stack>
);
};

View File

@@ -120,11 +120,6 @@ export class IndexingPolicyComponent extends React.Component<
indexTransformationProgress={this.props.indexTransformationProgress}
refreshIndexTransformationProgress={this.props.refreshIndexTransformationProgress}
/>
{this.props.isVectorSearchEnabled && (
<MessageBar messageBarType={MessageBarType.severeWarning}>
Container vector policies and vector indexes are not modifiable after container creation
</MessageBar>
)}
{isDirty(this.props.indexingPolicyContent, this.props.indexingPolicyContentBaseline) && (
<MessageBar messageBarType={MessageBarType.warning}>{unsavedEditorWarningMessage("indexPolicy")}</MessageBar>
)}

View File

@@ -14,6 +14,7 @@ import * as ViewModels from "../../../../Contracts/ViewModels";
import { handleError } from "Common/ErrorHandlingUtils";
import { cancelDataTransferJob, pollDataTransferJob } from "Common/dataAccess/dataTransfers";
import { Platform, configContext } from "ConfigContext";
import Explorer from "Explorer/Explorer";
import { ChangePartitionKeyPane } from "Explorer/Panes/ChangePartitionKeyPane/ChangePartitionKeyPane";
import {
@@ -177,12 +178,14 @@ export const PartitionKeyComponent: React.FC<PartitionKeyComponentProps> = ({ da
To change the partition key, a new destination container must be created or an existing destination container
selected. Data will then be copied to the destination container.
</Text>
<PrimaryButton
styles={{ root: { width: "fit-content" } }}
text="Change"
onClick={startPartitionkeyChangeWorkflow}
disabled={isCurrentJobInProgress(portalDataTransferJob)}
/>
{configContext.platform !== Platform.Emulator && (
<PrimaryButton
styles={{ root: { width: "fit-content" } }}
text="Change"
onClick={startPartitionkeyChangeWorkflow}
disabled={isCurrentJobInProgress(portalDataTransferJob)}
/>
)}
{portalDataTransferJob && (
<Stack>
<Text styles={textHeadingStyle}>{partitionKeyName} change job</Text>

View File

@@ -0,0 +1,177 @@
import "@testing-library/jest-dom";
import { fireEvent, render, screen } from "@testing-library/react";
import React from "react";
import { ThroughputBucketsComponent } from "./ThroughputBucketsComponent";
describe("ThroughputBucketsComponent", () => {
const mockOnBucketsChange = jest.fn();
const mockOnSaveableChange = jest.fn();
const defaultProps = {
currentBuckets: [
{ id: 1, maxThroughputPercentage: 50 },
{ id: 2, maxThroughputPercentage: 60 },
],
throughputBucketsBaseline: [
{ id: 1, maxThroughputPercentage: 40 },
{ id: 2, maxThroughputPercentage: 50 },
],
onBucketsChange: mockOnBucketsChange,
onSaveableChange: mockOnSaveableChange,
};
beforeEach(() => {
jest.clearAllMocks();
});
it("renders the correct number of buckets", () => {
render(<ThroughputBucketsComponent {...defaultProps} />);
expect(screen.getAllByText(/Group \d+/)).toHaveLength(5);
});
it("renders buckets in the correct order even if input is unordered", () => {
const unorderedBuckets = [
{ id: 2, maxThroughputPercentage: 60 },
{ id: 1, maxThroughputPercentage: 50 },
];
render(<ThroughputBucketsComponent {...defaultProps} currentBuckets={unorderedBuckets} />);
const bucketLabels = screen.getAllByText(/Group \d+/).map((el) => el.textContent);
expect(bucketLabels).toEqual(["Group 1 (Data Explorer Query Bucket)", "Group 2", "Group 3", "Group 4", "Group 5"]);
});
it("renders all provided buckets even if they exceed the max default bucket count", () => {
const oversizedBuckets = [
{ id: 1, maxThroughputPercentage: 50 },
{ id: 2, maxThroughputPercentage: 60 },
{ id: 3, maxThroughputPercentage: 70 },
{ id: 4, maxThroughputPercentage: 80 },
{ id: 5, maxThroughputPercentage: 90 },
{ id: 6, maxThroughputPercentage: 100 },
{ id: 7, maxThroughputPercentage: 40 },
];
render(<ThroughputBucketsComponent {...defaultProps} currentBuckets={oversizedBuckets} />);
expect(screen.getAllByText(/Group \d+/)).toHaveLength(7);
expect(screen.getByDisplayValue("50")).toBeInTheDocument();
expect(screen.getByDisplayValue("60")).toBeInTheDocument();
expect(screen.getByDisplayValue("70")).toBeInTheDocument();
expect(screen.getByDisplayValue("80")).toBeInTheDocument();
expect(screen.getByDisplayValue("90")).toBeInTheDocument();
expect(screen.getByDisplayValue("100")).toBeInTheDocument();
expect(screen.getByDisplayValue("40")).toBeInTheDocument();
});
it("calls onBucketsChange when a bucket value changes", () => {
render(<ThroughputBucketsComponent {...defaultProps} />);
const input = screen.getByDisplayValue("50");
fireEvent.change(input, { target: { value: "70" } });
expect(mockOnBucketsChange).toHaveBeenCalledWith([
{ id: 1, maxThroughputPercentage: 70 },
{ id: 2, maxThroughputPercentage: 60 },
{ id: 3, maxThroughputPercentage: 100 },
{ id: 4, maxThroughputPercentage: 100 },
{ id: 5, maxThroughputPercentage: 100 },
]);
});
it("triggers onSaveableChange when values change", () => {
render(<ThroughputBucketsComponent {...defaultProps} />);
const input = screen.getByDisplayValue("50");
fireEvent.change(input, { target: { value: "80" } });
expect(mockOnSaveableChange).toHaveBeenCalledWith(true);
});
it("updates state consistently after multiple changes to different buckets", () => {
render(<ThroughputBucketsComponent {...defaultProps} />);
const input1 = screen.getByDisplayValue("50");
fireEvent.change(input1, { target: { value: "70" } });
const input2 = screen.getByDisplayValue("60");
fireEvent.change(input2, { target: { value: "80" } });
expect(mockOnBucketsChange).toHaveBeenCalledWith([
{ id: 1, maxThroughputPercentage: 70 },
{ id: 2, maxThroughputPercentage: 80 },
{ id: 3, maxThroughputPercentage: 100 },
{ id: 4, maxThroughputPercentage: 100 },
{ id: 5, maxThroughputPercentage: 100 },
]);
});
it("resets to baseline when currentBuckets are reset", () => {
const { rerender } = render(<ThroughputBucketsComponent {...defaultProps} />);
const input1 = screen.getByDisplayValue("50");
fireEvent.change(input1, { target: { value: "70" } });
rerender(<ThroughputBucketsComponent {...defaultProps} currentBuckets={defaultProps.throughputBucketsBaseline} />);
expect(screen.getByDisplayValue("40")).toBeInTheDocument();
expect(screen.getByDisplayValue("50")).toBeInTheDocument();
});
it("does not call onBucketsChange when value remains unchanged", () => {
render(<ThroughputBucketsComponent {...defaultProps} />);
const input = screen.getByDisplayValue("50");
fireEvent.change(input, { target: { value: "50" } });
expect(mockOnBucketsChange).not.toHaveBeenCalled();
});
it("disables input and slider when maxThroughputPercentage is 100", () => {
render(
<ThroughputBucketsComponent
{...defaultProps}
currentBuckets={[
{ id: 1, maxThroughputPercentage: 100 },
{ id: 2, maxThroughputPercentage: 50 },
]}
/>,
);
const disabledInputs = screen.getAllByDisplayValue("100");
expect(disabledInputs.length).toBeGreaterThan(0);
expect(disabledInputs[0]).toBeDisabled();
const sliders = screen.getAllByRole("slider");
expect(sliders.length).toBeGreaterThan(0);
expect(sliders[0]).toHaveAttribute("aria-disabled", "true");
expect(sliders[1]).toHaveAttribute("aria-disabled", "false");
});
it("toggles bucket value between 50 and 100 with switch", () => {
render(<ThroughputBucketsComponent {...defaultProps} />);
const toggles = screen.getAllByRole("switch");
fireEvent.click(toggles[0]);
expect(mockOnBucketsChange).toHaveBeenCalledWith([
{ id: 1, maxThroughputPercentage: 100 },
{ id: 2, maxThroughputPercentage: 60 },
{ id: 3, maxThroughputPercentage: 100 },
{ id: 4, maxThroughputPercentage: 100 },
{ id: 5, maxThroughputPercentage: 100 },
]);
fireEvent.click(toggles[0]);
expect(mockOnBucketsChange).toHaveBeenCalledWith([
{ id: 1, maxThroughputPercentage: 50 },
{ id: 2, maxThroughputPercentage: 60 },
{ id: 3, maxThroughputPercentage: 100 },
{ id: 4, maxThroughputPercentage: 100 },
{ id: 5, maxThroughputPercentage: 100 },
]);
});
it("ensures default buckets are used when no buckets are provided", () => {
render(<ThroughputBucketsComponent {...defaultProps} currentBuckets={[]} />);
expect(screen.getAllByText(/Group \d+/)).toHaveLength(5);
expect(screen.getAllByDisplayValue("100")).toHaveLength(5);
});
});

View File

@@ -0,0 +1,105 @@
import { Label, Slider, Stack, TextField, Toggle } from "@fluentui/react";
import { ThroughputBucket } from "Contracts/DataModels";
import React, { FC, useEffect, useState } from "react";
import { isDirty } from "../../SettingsUtils";
const MAX_BUCKET_SIZES = 5;
const DEFAULT_BUCKETS = Array.from({ length: MAX_BUCKET_SIZES }, (_, i) => ({
id: i + 1,
maxThroughputPercentage: 100,
}));
export interface ThroughputBucketsComponentProps {
currentBuckets: ThroughputBucket[];
throughputBucketsBaseline: ThroughputBucket[];
onBucketsChange: (updatedBuckets: ThroughputBucket[]) => void;
onSaveableChange: (isSaveable: boolean) => void;
}
export const ThroughputBucketsComponent: FC<ThroughputBucketsComponentProps> = ({
currentBuckets,
throughputBucketsBaseline,
onBucketsChange,
onSaveableChange,
}) => {
const getThroughputBuckets = (buckets: ThroughputBucket[]): ThroughputBucket[] => {
if (!buckets || buckets.length === 0) {
return DEFAULT_BUCKETS;
}
const maxBuckets = Math.max(DEFAULT_BUCKETS.length, buckets.length);
const adjustedDefaultBuckets = Array.from({ length: maxBuckets }, (_, i) => ({
id: i + 1,
maxThroughputPercentage: 100,
}));
return adjustedDefaultBuckets.map(
(defaultBucket) => buckets?.find((bucket) => bucket.id === defaultBucket.id) || defaultBucket,
);
};
const [throughputBuckets, setThroughputBuckets] = useState<ThroughputBucket[]>(getThroughputBuckets(currentBuckets));
useEffect(() => {
setThroughputBuckets(getThroughputBuckets(currentBuckets));
onSaveableChange(false);
}, [currentBuckets]);
useEffect(() => {
const isChanged = isDirty(throughputBuckets, getThroughputBuckets(throughputBucketsBaseline));
onSaveableChange(isChanged);
}, [throughputBuckets]);
const handleBucketChange = (id: number, newValue: number) => {
const updatedBuckets = throughputBuckets.map((bucket) =>
bucket.id === id ? { ...bucket, maxThroughputPercentage: newValue } : bucket,
);
setThroughputBuckets(updatedBuckets);
const settingsChanged = isDirty(updatedBuckets, throughputBuckets);
settingsChanged && onBucketsChange(updatedBuckets);
};
const onToggle = (id: number, checked: boolean) => {
handleBucketChange(id, checked ? 50 : 100);
};
return (
<Stack tokens={{ childrenGap: "m" }} styles={{ root: { width: "70%", maxWidth: 700 } }}>
<Label>Throughput Buckets</Label>
<Stack>
{throughputBuckets?.map((bucket) => (
<Stack key={bucket.id} horizontal tokens={{ childrenGap: 8 }} verticalAlign="center">
<Slider
min={1}
max={100}
step={1}
value={bucket.maxThroughputPercentage}
onChange={(newValue) => handleBucketChange(bucket.id, newValue)}
showValue={false}
label={`Group ${bucket.id}${bucket.id === 1 ? " (Data Explorer Query Bucket)" : ""}`}
styles={{ root: { flex: 2, maxWidth: 400 } }}
disabled={bucket.maxThroughputPercentage === 100}
/>
<TextField
value={bucket.maxThroughputPercentage.toString()}
onChange={(event, newValue) => handleBucketChange(bucket.id, parseInt(newValue || "0", 10))}
type="number"
suffix="%"
styles={{
fieldGroup: { width: 80 },
}}
disabled={bucket.maxThroughputPercentage === 100}
/>
<Toggle
onText="Active"
offText="Inactive"
checked={bucket.maxThroughputPercentage !== 100}
onChange={(event, checked) => onToggle(bucket.id, checked)}
styles={{ root: { marginBottom: 0 }, text: { fontSize: 12 } }}
></Toggle>
</Stack>
))}
</Stack>
</Stack>
);
};
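A worked example of the getThroughputBuckets normalization above: a partially saved configuration is padded with inactive (100%) defaults so the UI always shows at least MAX_BUCKET_SIZES groups.
const saved: ThroughputBucket[] = [{ id: 2, maxThroughputPercentage: 60 }];
// getThroughputBuckets(saved) yields:
// [
//   { id: 1, maxThroughputPercentage: 100 },
//   { id: 2, maxThroughputPercentage: 60 },
//   { id: 3, maxThroughputPercentage: 100 },
//   { id: 4, maxThroughputPercentage: 100 },
//   { id: 5, maxThroughputPercentage: 100 },
// ]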

View File

@@ -17,14 +17,13 @@ import {
} from "@fluentui/react";
import React from "react";
import * as DataModels from "../../../../../Contracts/DataModels";
import { SubscriptionType } from "../../../../../Contracts/SubscriptionType";
import * as SharedConstants from "../../../../../Shared/Constants";
import { Action, ActionModifiers } from "../../../../../Shared/Telemetry/TelemetryConstants";
import * as TelemetryProcessor from "../../../../../Shared/Telemetry/TelemetryProcessor";
import { userContext } from "../../../../../UserContext";
import * as AutoPilotUtils from "../../../../../Utils/AutoPilotUtils";
import { autoPilotThroughput1K } from "../../../../../Utils/AutoPilotUtils";
import { calculateEstimateNumber, usageInGB } from "../../../../../Utils/PricingUtils";
import { calculateEstimateNumber } from "../../../../../Utils/PricingUtils";
import { Int32 } from "../../../../Panes/Tables/Validators/EntityPropertyValidationCommon";
import {
PriceBreakdown,
@@ -366,29 +365,6 @@ export class ThroughputInputAutoPilotV3Component extends React.Component<
});
};
private minRUperGBSurvey = (): JSX.Element => {
const href = `https://ncv.microsoft.com/vRBTO37jmO?ctx={"AzureSubscriptionId":"${userContext.subscriptionId}","CosmosDBAccountName":"${userContext.databaseAccount?.name}"}`;
const oneTBinKB = 1000000000;
const minRUperGB = 10;
const featureFlagEnabled = userContext.features.showMinRUSurvey;
const collectionIsEligible =
userContext.subscriptionType !== SubscriptionType.Internal &&
this.props.usageSizeInKB > oneTBinKB &&
this.props.minimum >= usageInGB(this.props.usageSizeInKB) * minRUperGB;
if (featureFlagEnabled || collectionIsEligible) {
return (
<Text>
Need to scale below {this.props.minimum} RU/s? Reach out by filling{" "}
<a target="_blank" rel="noreferrer" href={href}>
this questionnaire
</a>
.
</Text>
);
}
return undefined;
};
private renderThroughputModeChoices = (): JSX.Element => {
const labelId = "settingsV2RadioButtonLabelId";
return (
@@ -661,7 +637,6 @@ export class ThroughputInputAutoPilotV3Component extends React.Component<
</Link>
</Text>
)}
{this.minRUperGBSurvey()}
{this.props.spendAckVisible && (
<Checkbox
id="spendAckCheckBox"

View File

@@ -4,7 +4,15 @@ import * as ViewModels from "../../../Contracts/ViewModels";
import { MongoIndex } from "../../../Utils/arm/generatedClients/cosmos/types";
const zeroValue = 0;
export type isDirtyTypes = boolean | string | number | DataModels.IndexingPolicy | DataModels.ComputedProperties;
export type isDirtyTypes =
| boolean
| string
| number
| DataModels.IndexingPolicy
| DataModels.ComputedProperties
| DataModels.VectorEmbedding[]
| DataModels.FullTextPolicy
| DataModels.ThroughputBucket[];
export const TtlOff = "off";
export const TtlOn = "on";
export const TtlOnNoDefault = "on-nodefault";
@@ -48,6 +56,12 @@ export enum SettingsV2TabTypes {
PartitionKeyTab,
ComputedPropertiesTab,
ContainerVectorPolicyTab,
ThroughputBucketsTab,
}
export enum ContainerPolicyTabTypes {
VectorPolicyTab,
FullTextPolicyTab,
}
export interface IsComponentDirtyResult {
@@ -154,7 +168,9 @@ export const getTabTitle = (tab: SettingsV2TabTypes): string => {
case SettingsV2TabTypes.ComputedPropertiesTab:
return "Computed Properties";
case SettingsV2TabTypes.ContainerVectorPolicyTab:
return "Container Vector Policy (preview)";
return "Container Policies";
case SettingsV2TabTypes.ThroughputBucketsTab:
return "Throughput Buckets";
default:
throw new Error(`Unknown tab ${tab}`);
}
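The pivot code in this compare round-trips tab keys through numeric-enum reverse mapping; a small worked example:
// Writing the key into a PivotItem:
const itemKey = ContainerPolicyTabTypes[ContainerPolicyTabTypes.FullTextPolicyTab]; // "FullTextPolicyTab"
// Reading it back in onPivotChange:
const selected = ContainerPolicyTabTypes[itemKey as keyof typeof ContainerPolicyTabTypes]; // ContainerPolicyTabTypes.FullTextPolicyTab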

View File

@@ -46,6 +46,8 @@ export const collection = {
query: "query",
},
]),
vectorEmbeddingPolicy: ko.observable<DataModels.VectorEmbeddingPolicy>({} as DataModels.VectorEmbeddingPolicy),
fullTextPolicy: ko.observable<DataModels.FullTextPolicy>({} as DataModels.FullTextPolicy),
readSettings: () => {
return;
},

View File

@@ -55,6 +55,7 @@ exports[`SettingsComponent renders 1`] = `
},
"databaseId": "test",
"defaultTtl": [Function],
"fullTextPolicy": [Function],
"geospatialConfig": [Function],
"getDatabase": [Function],
"id": [Function],
@@ -71,6 +72,7 @@ exports[`SettingsComponent renders 1`] = `
"readSettings": [Function],
"uniqueKeyPolicy": {},
"usageSizeInKB": [Function],
"vectorEmbeddingPolicy": [Function],
}
}
isAutoPilotSelected={false}
@@ -132,6 +134,7 @@ exports[`SettingsComponent renders 1`] = `
},
"databaseId": "test",
"defaultTtl": [Function],
"fullTextPolicy": [Function],
"geospatialConfig": [Function],
"getDatabase": [Function],
"id": [Function],
@@ -148,6 +151,7 @@ exports[`SettingsComponent renders 1`] = `
"readSettings": [Function],
"uniqueKeyPolicy": {},
"usageSizeInKB": [Function],
"vectorEmbeddingPolicy": [Function],
}
}
displayedTtlSeconds="5"
@@ -249,6 +253,7 @@ exports[`SettingsComponent renders 1`] = `
},
"databaseId": "test",
"defaultTtl": [Function],
"fullTextPolicy": [Function],
"geospatialConfig": [Function],
"getDatabase": [Function],
"id": [Function],
@@ -265,6 +270,7 @@ exports[`SettingsComponent renders 1`] = `
"readSettings": [Function],
"uniqueKeyPolicy": {},
"usageSizeInKB": [Function],
"vectorEmbeddingPolicy": [Function],
}
}
explorer={

View File

@@ -1,4 +1,5 @@
import { Checkbox, DirectionalHint, Link, Stack, Text, TextField, TooltipHost } from "@fluentui/react";
import { getWorkloadType } from "Common/DatabaseAccountUtility";
import { useDatabases } from "Explorer/useDatabases";
import React, { FunctionComponent, useEffect, useState } from "react";
import * as Constants from "../../../Common/Constants";
@@ -34,10 +35,15 @@ export const ThroughputInput: FunctionComponent<ThroughputInputProps> = ({
setIsThroughputCapExceeded,
onCostAcknowledgeChange,
}: ThroughputInputProps) => {
const defaultThroughput: number =
isFreeTier ||
isQuickstart ||
[Constants.WorkloadType.Learning, Constants.WorkloadType.DevelopmentTesting].includes(getWorkloadType())
? AutoPilotUtils.autoPilotThroughput1K
: AutoPilotUtils.autoPilotThroughput4K;
const [isAutoscaleSelected, setIsAutoScaleSelected] = useState<boolean>(true);
const [throughput, setThroughput] = useState<number>(
isFreeTier || isQuickstart ? AutoPilotUtils.autoPilotThroughput1K : AutoPilotUtils.autoPilotThroughput4K,
);
const [throughput, setThroughput] = useState<number>(defaultThroughput);
const [isCostAcknowledged, setIsCostAcknowledged] = useState<boolean>(false);
const [throughputError, setThroughputError] = useState<string>("");
const [totalThroughputUsed, setTotalThroughputUsed] = useState<number>(0);
@@ -47,7 +53,6 @@ export const ThroughputInput: FunctionComponent<ThroughputInputProps> = ({
const throughputCap = userContext.databaseAccount?.properties.capacity?.totalThroughputLimit;
const numberOfRegions = userContext.databaseAccount?.properties.locations?.length || 1;
useEffect(() => {
// throughput cap check for the initial state
let totalThroughput = 0;
@@ -157,9 +162,6 @@ export const ThroughputInput: FunctionComponent<ThroughputInputProps> = ({
const handleOnChangeMode = (event: React.ChangeEvent<HTMLInputElement>, mode: string): void => {
if (mode === "Autoscale") {
const defaultThroughput = isFreeTier
? AutoPilotUtils.autoPilotThroughput1K
: AutoPilotUtils.autoPilotThroughput4K;
setThroughput(defaultThroughput);
setIsAutoScaleSelected(true);
setThroughputValue(defaultThroughput);
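Restated on its own, the new default picks the 1K autoscale maximum for free-tier, quickstart, and Learning/DevelopmentTesting workload accounts, and the 4K maximum otherwise (sketch using the same imports as this file).
const defaultAutoscaleMax =
  isFreeTier ||
  isQuickstart ||
  [Constants.WorkloadType.Learning, Constants.WorkloadType.DevelopmentTesting].includes(getWorkloadType())
    ? AutoPilotUtils.autoPilotThroughput1K // 1000 RU/s
    : AutoPilotUtils.autoPilotThroughput4K; // 4000 RU/s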

View File

@@ -23,7 +23,7 @@ import { useCallback } from "react";
export interface TreeNodeMenuItem {
label: string;
onClick: () => void;
onClick: (value?: React.RefObject<HTMLElement>) => void;
iconSrc?: string;
isDisabled?: boolean;
styleClass?: string;
@@ -74,6 +74,7 @@ export const TreeNodeComponent: React.FC<TreeNodeComponentProps> = ({
openItems,
}: TreeNodeComponentProps): JSX.Element => {
const [isLoading, setIsLoading] = React.useState<boolean>(false);
const contextMenuRef = React.useRef<HTMLButtonElement>(null);
const treeStyles = useTreeStyles();
const getSortedChildren = (treeNode: TreeNode): TreeNode[] => {
@@ -141,7 +142,7 @@ export const TreeNodeComponent: React.FC<TreeNodeComponentProps> = ({
data-test={`TreeNode/ContextMenuItem:${menuItem.label}`}
disabled={menuItem.isDisabled}
key={menuItem.label}
onClick={menuItem.onClick}
onClick={() => menuItem.onClick(contextMenuRef)}
>
{menuItem.label}
</MenuItem>
@@ -190,6 +191,7 @@ export const TreeNodeComponent: React.FC<TreeNodeComponentProps> = ({
className={mergeClasses(treeStyles.actionsButton, shouldShowAsSelected && treeStyles.selectedItem)}
data-test="TreeNode/ContextMenuTrigger"
appearance="subtle"
ref={contextMenuRef}
icon={<MoreHorizontal20Regular />}
/>
</MenuTrigger>
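A sketch of how a menu consumer can use the new ref argument; the intent (returning keyboard focus to the context-menu trigger once a panel opened by the item closes) is inferred from the accessibility fixes in this compare, and openDeletePanel is a hypothetical helper.
const deleteMenuItem: TreeNodeMenuItem = {
  label: "Delete Container",
  onClick: (contextMenuButtonRef?: React.RefObject<HTMLElement>) => {
    openDeletePanel({
      // Restore focus to the "..." trigger so keyboard and screen-reader users keep their place.
      onDismiss: () => contextMenuButtonRef?.current?.focus(),
    });
  },
};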

View File

@@ -1478,14 +1478,14 @@ exports[`TreeNodeComponent renders a node with a menu 1`] = `
<MenuList>
<MenuItem
data-test="TreeNode/ContextMenuItem:enabledItem"
onClick={[MockFunction enabledItemClick]}
onClick={[Function]}
>
enabledItem
</MenuItem>
<MenuItem
data-test="TreeNode/ContextMenuItem:disabledItem"
disabled={true}
onClick={[MockFunction disabledItemClick]}
onClick={[Function]}
>
disabledItem
</MenuItem>
@@ -1518,7 +1518,7 @@ exports[`TreeNodeComponent renders a node with a menu 1`] = `
<MenuItem
data-test="TreeNode/ContextMenuItem:enabledItem"
key="enabledItem"
onClick={[MockFunction enabledItemClick]}
onClick={[Function]}
>
enabledItem
</MenuItem>
@@ -1526,7 +1526,7 @@ exports[`TreeNodeComponent renders a node with a menu 1`] = `
data-test="TreeNode/ContextMenuItem:disabledItem"
disabled={true}
key="disabledItem"
onClick={[MockFunction disabledItemClick]}
onClick={[Function]}
>
disabledItem
</MenuItem>

View File

@@ -2,7 +2,7 @@ import "@testing-library/jest-dom";
import { RenderResult, fireEvent, render, screen, waitFor } from "@testing-library/react";
import { VectorEmbedding, VectorIndex } from "Contracts/DataModels";
import React from "react";
import { AddVectorEmbeddingPolicyForm } from "./AddVectorEmbeddingPolicyForm";
import { VectorEmbeddingPoliciesComponent } from "./VectorEmbeddingPoliciesComponent";
const mockVectorEmbedding: VectorEmbedding[] = [
{ path: "/vector1", dataType: "float32", distanceFunction: "euclidean", dimensions: 0 },
@@ -17,9 +17,9 @@ describe("AddVectorEmbeddingPolicyForm", () => {
beforeEach(() => {
component = render(
<AddVectorEmbeddingPolicyForm
vectorEmbedding={mockVectorEmbedding}
vectorIndex={mockVectorIndex}
<VectorEmbeddingPoliciesComponent
vectorEmbeddings={mockVectorEmbedding}
vectorIndexes={mockVectorIndex}
onVectorEmbeddingChange={mockOnVectorEmbeddingChange}
/>,
);
@@ -36,7 +36,7 @@ describe("AddVectorEmbeddingPolicyForm", () => {
});
test("calls onDelete when delete button is clicked", async () => {
const deleteButton = component.container.querySelector("#delete-vector-policy-1");
const deleteButton = component.container.querySelector("#delete-Vector-embedding-1");
fireEvent.click(deleteButton);
expect(mockOnVectorEmbeddingChange).toHaveBeenCalled();
expect(screen.queryByText("Vector embedding 1")).toBeNull();
@@ -49,21 +49,19 @@ describe("AddVectorEmbeddingPolicyForm", () => {
test("validates input correctly", async () => {
fireEvent.change(screen.getByPlaceholderText("/vector1"), { target: { value: "" } });
await waitFor(() => expect(screen.getByText("Vector embedding path should not be empty")).toBeInTheDocument(), {
await waitFor(() => expect(screen.getByText("Path should not be empty")).toBeInTheDocument(), {
timeout: 1500,
});
await waitFor(
() =>
expect(
screen.getByText("Vector embedding dimension must be greater than 0 and less than or equal 4096"),
).toBeInTheDocument(),
expect(screen.getByText("Dimension must be greater than 0 and less than or equal 4096")).toBeInTheDocument(),
{
timeout: 1500,
},
);
fireEvent.change(component.container.querySelector("#vector-policy-dimension-1"), { target: { value: "4096" } });
fireEvent.change(screen.getByPlaceholderText("/vector1"), { target: { value: "/vector1" } });
await waitFor(() => expect(screen.queryByText("Vector embedding path should not be empty")).toBeNull(), {
await waitFor(() => expect(screen.queryByText("Path should not be empty")).toBeNull(), {
timeout: 1500,
});
await waitFor(

View File

@@ -0,0 +1,470 @@
import {
DefaultButton,
Dropdown,
IDropdownOption,
IStyleFunctionOrObject,
ITextFieldStyleProps,
ITextFieldStyles,
Label,
Stack,
TextField,
} from "@fluentui/react";
import { VectorEmbedding, VectorIndex } from "Contracts/DataModels";
import { CollapsibleSectionComponent } from "Explorer/Controls/CollapsiblePanel/CollapsibleSectionComponent";
import {
getDataTypeOptions,
getDistanceFunctionOptions,
getIndexTypeOptions,
} from "Explorer/Controls/VectorSearch/VectorSearchUtils";
import React, { FunctionComponent, useState } from "react";
export interface IVectorEmbeddingPoliciesComponentProps {
vectorEmbeddings: VectorEmbedding[];
onVectorEmbeddingChange: (
vectorEmbeddings: VectorEmbedding[],
vectorIndexingPolicies: VectorIndex[],
validationPassed: boolean,
) => void;
vectorIndexes?: VectorIndex[];
discardChanges?: boolean;
onChangesDiscarded?: () => void;
disabled?: boolean;
}
export interface VectorEmbeddingPolicyData {
path: string;
dataType: VectorEmbedding["dataType"];
distanceFunction: VectorEmbedding["distanceFunction"];
dimensions: number;
indexType: VectorIndex["type"] | "none";
pathError: string;
dimensionsError: string;
diskANNShardKey?: string;
diskANNShardKeyError?: string;
indexingSearchListSize?: number;
indexingSearchListSizeError?: string;
quantizationByteSize?: number;
quantizationByteSizeError?: string;
}
type VectorEmbeddingPolicyProperty = "dataType" | "distanceFunction" | "indexType";
const labelStyles = {
root: {
fontSize: 12,
},
};
const textFieldStyles: IStyleFunctionOrObject<ITextFieldStyleProps, ITextFieldStyles> = {
fieldGroup: {
height: 27,
},
field: {
fontSize: 12,
padding: "0 8px",
},
};
const dropdownStyles = {
title: {
height: 27,
lineHeight: "24px",
fontSize: 12,
},
dropdown: {
height: 27,
lineHeight: "24px",
},
dropdownItem: {
fontSize: 12,
},
};
export const VectorEmbeddingPoliciesComponent: FunctionComponent<IVectorEmbeddingPoliciesComponentProps> = ({
vectorEmbeddings,
vectorIndexes,
onVectorEmbeddingChange,
discardChanges,
onChangesDiscarded,
disabled,
}): JSX.Element => {
const onVectorEmbeddingPathError = (path: string, index?: number): string => {
let error = "";
if (!path) {
error = "Path should not be empty";
}
if (
index >= 0 &&
vectorEmbeddingPolicyData?.find(
(vectorEmbedding: VectorEmbeddingPolicyData, dataIndex: number) =>
dataIndex !== index && vectorEmbedding.path === path,
)
) {
error = "Path is already defined";
}
return error;
};
const onVectorEmbeddingDimensionError = (dimension: number, indexType: VectorIndex["type"] | "none"): string => {
let error = "";
if (dimension <= 0 || dimension > 4096) {
error = "Dimension must be greater than 0 and less than or equal 4096";
}
if (indexType === "flat" && dimension > 505) {
error = "Maximum allowed dimension for flat index is 505";
}
return error;
};
const onQuantizationByteSizeError = (size: number): string => {
let error = "";
if (size < 1 || size > 512) {
error = "Quantization byte size must be greater than 0 and less than or equal to 512";
}
return error;
};
const onIndexingSearchListSizeError = (size: number): string => {
let error = "";
if (size < 25 || size > 500) {
error = "Indexing search list size must be greater than or equal to 25 and less than or equal to 500";
}
return error;
};
//TODO: no restrictions yet due to this field being removed for now.
// Uncomment and replace with validation code when field is reinstated
// const onDiskANNShardKeyError = (shardKey: string): string => {
// return "";
// };
const initializeData = (vectorEmbeddings: VectorEmbedding[], vectorIndexes: VectorIndex[]) => {
const mergedData: VectorEmbeddingPolicyData[] = [];
vectorEmbeddings.forEach((embedding) => {
const matchingIndex = displayIndexes ? vectorIndexes.find((index) => index.path === embedding.path) : undefined;
mergedData.push({
...embedding,
indexType: matchingIndex?.type || "none",
indexingSearchListSize: matchingIndex?.indexingSearchListSize || undefined,
quantizationByteSize: matchingIndex?.quantizationByteSize || undefined,
pathError: onVectorEmbeddingPathError(embedding.path),
dimensionsError: onVectorEmbeddingDimensionError(embedding.dimensions, matchingIndex?.type || "none"),
});
});
return mergedData;
};
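// Worked example (sketch): given one embedding { path: "/v", dataType: "float32",
// distanceFunction: "cosine", dimensions: 1536 } and a matching index
// { path: "/v", type: "diskANN", indexingSearchListSize: 100 }, initializeData
// returns a single row with indexType "diskANN", indexingSearchListSize 100,
// quantizationByteSize undefined, and empty pathError/dimensionsError.
// An embedding with no matching vector index falls back to indexType "none".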
const [displayIndexes] = useState<boolean>(!!vectorIndexes);
const [vectorEmbeddingPolicyData, setVectorEmbeddingPolicyData] = useState<VectorEmbeddingPolicyData[]>(
initializeData(vectorEmbeddings, vectorIndexes),
);
React.useEffect(() => {
propagateData();
}, [vectorEmbeddingPolicyData]);
React.useEffect(() => {
if (discardChanges) {
setVectorEmbeddingPolicyData(initializeData(vectorEmbeddings, vectorIndexes));
onChangesDiscarded();
}
}, [discardChanges]);
const propagateData = () => {
const vectorEmbeddings: VectorEmbedding[] = vectorEmbeddingPolicyData.map((policy: VectorEmbeddingPolicyData) => ({
path: policy.path,
dataType: policy.dataType,
dimensions: policy.dimensions,
distanceFunction: policy.distanceFunction,
}));
const vectorIndexes: VectorIndex[] = vectorEmbeddingPolicyData
.filter((policy: VectorEmbeddingPolicyData) => policy.indexType !== "none")
.map(
(policy) =>
({
path: policy.path,
type: policy.indexType,
indexingSearchListSize: policy.indexingSearchListSize,
quantizationByteSize: policy.quantizationByteSize,
}) as VectorIndex,
);
const validationPassed = vectorEmbeddingPolicyData.every(
(policy: VectorEmbeddingPolicyData) => policy.pathError === "" && policy.dimensionsError === "",
);
onVectorEmbeddingChange(vectorEmbeddings, vectorIndexes, validationPassed);
};
const onVectorEmbeddingPathChange = (index: number, event: React.ChangeEvent<HTMLInputElement>) => {
const value = event.target.value.trim();
const vectorEmbeddings = [...vectorEmbeddingPolicyData];
if (!vectorEmbeddings[index]?.path && !value.startsWith("/")) {
vectorEmbeddings[index].path = "/" + value;
} else {
vectorEmbeddings[index].path = value;
}
const error = onVectorEmbeddingPathError(value, index);
vectorEmbeddings[index].pathError = error;
setVectorEmbeddingPolicyData(vectorEmbeddings);
};
const onVectorEmbeddingDimensionsChange = (index: number, event: React.ChangeEvent<HTMLInputElement>) => {
const value = parseInt(event.target.value.trim()) || 0;
const vectorEmbeddings = [...vectorEmbeddingPolicyData];
const vectorEmbedding = vectorEmbeddings[index];
vectorEmbeddings[index].dimensions = value;
const error = onVectorEmbeddingDimensionError(value, vectorEmbedding.indexType);
vectorEmbeddings[index].dimensionsError = error;
setVectorEmbeddingPolicyData(vectorEmbeddings);
};
const onVectorEmbeddingIndexTypeChange = (index: number, option: IDropdownOption): void => {
const vectorEmbeddings = [...vectorEmbeddingPolicyData];
const vectorEmbedding = vectorEmbeddings[index];
vectorEmbeddings[index].indexType = option.key as never;
const error = onVectorEmbeddingDimensionError(vectorEmbedding.dimensions, vectorEmbedding.indexType);
vectorEmbeddings[index].dimensionsError = error;
if (vectorEmbedding.indexType === "diskANN") {
vectorEmbedding.indexingSearchListSize = 100;
} else {
vectorEmbedding.indexingSearchListSize = undefined;
}
setVectorEmbeddingPolicyData(vectorEmbeddings);
};
const onQuantizationByteSizeChange = (index: number, event: React.ChangeEvent<HTMLInputElement>) => {
const value = parseInt(event.target.value.trim()) || 0;
const vectorEmbeddings = [...vectorEmbeddingPolicyData];
vectorEmbeddings[index].quantizationByteSize = value;
vectorEmbeddings[index].quantizationByteSizeError = onQuantizationByteSizeError(value);
setVectorEmbeddingPolicyData(vectorEmbeddings);
};
const onIndexingSearchListSizeChange = (index: number, event: React.ChangeEvent<HTMLInputElement>) => {
const value = parseInt(event.target.value.trim()) || 0;
const vectorEmbeddings = [...vectorEmbeddingPolicyData];
vectorEmbeddings[index].indexingSearchListSize = value;
vectorEmbeddings[index].indexingSearchListSizeError = onIndexingSearchListSizeError(value);
setVectorEmbeddingPolicyData(vectorEmbeddings);
};
// TODO: uncomment after Ignite
// DiskANNShardKey was removed for Ignite due to backend problems. Leaving this here as it will be reinstated immediately after Ignite
// const onDiskANNShardKeyChange = (index: number, event: React.ChangeEvent<HTMLInputElement>) => {
// const value = event.target.value.trim();
// const vectorEmbeddings = [...vectorEmbeddingPolicyData];
// if (!vectorEmbeddings[index]?.diskANNShardKey && !value.startsWith("/")) {
// vectorEmbeddings[index].diskANNShardKey = "/" + value;
// } else {
// vectorEmbeddings[index].diskANNShardKey = value;
// }
// const error = onDiskANNShardKeyError(value);
// vectorEmbeddings[index].diskANNShardKeyError = error;
// setVectorEmbeddingPolicyData(vectorEmbeddings);
// }
const onVectorEmbeddingPolicyChange = (
index: number,
option: IDropdownOption,
property: VectorEmbeddingPolicyProperty,
): void => {
const vectorEmbeddings = [...vectorEmbeddingPolicyData];
vectorEmbeddings[index][property] = option.key as never;
setVectorEmbeddingPolicyData(vectorEmbeddings);
};
const onAdd = () => {
setVectorEmbeddingPolicyData([
...vectorEmbeddingPolicyData,
{
path: "",
dataType: "float32",
distanceFunction: "euclidean",
dimensions: 0,
indexType: "none",
pathError: onVectorEmbeddingPathError(""),
dimensionsError: onVectorEmbeddingDimensionError(0, "none"),
},
]);
};
const onDelete = (index: number) => {
const vectorEmbeddings = vectorEmbeddingPolicyData.filter((_uniqueKey, j) => index !== j);
setVectorEmbeddingPolicyData(vectorEmbeddings);
};
return (
<Stack tokens={{ childrenGap: 4 }}>
{vectorEmbeddingPolicyData &&
vectorEmbeddingPolicyData.length > 0 &&
vectorEmbeddingPolicyData.map((vectorEmbeddingPolicy: VectorEmbeddingPolicyData, index: number) => (
<CollapsibleSectionComponent
disabled={disabled}
key={index}
isExpandedByDefault={true}
title={`Vector embedding ${index + 1}`}
showDelete={true}
onDelete={() => onDelete(index)}
>
<Stack horizontal tokens={{ childrenGap: 4 }}>
<Stack
styles={{
root: {
margin: "0 0 6px 20px !important",
paddingLeft: 20,
width: "80%",
borderLeft: "1px solid",
},
}}
>
<Stack>
<Label disabled={disabled} styles={labelStyles}>
Path
</Label>
<TextField
disabled={disabled}
id={`vector-policy-path-${index + 1}`}
required={true}
placeholder="/vector1"
styles={textFieldStyles}
onChange={(event: React.ChangeEvent<HTMLInputElement>) => onVectorEmbeddingPathChange(index, event)}
value={vectorEmbeddingPolicy.path || ""}
errorMessage={vectorEmbeddingPolicy.pathError}
/>
</Stack>
<Stack>
<Label disabled={disabled} styles={labelStyles}>
Data type
</Label>
<Dropdown
disabled={disabled}
required={true}
styles={dropdownStyles}
options={getDataTypeOptions()}
selectedKey={vectorEmbeddingPolicy.dataType}
onChange={(_event: React.FormEvent<HTMLDivElement>, option: IDropdownOption) =>
onVectorEmbeddingPolicyChange(index, option, "dataType")
}
></Dropdown>
</Stack>
<Stack>
<Label disabled={disabled} styles={labelStyles}>
Distance function
</Label>
<Dropdown
disabled={disabled}
required={true}
styles={dropdownStyles}
options={getDistanceFunctionOptions()}
selectedKey={vectorEmbeddingPolicy.distanceFunction}
onChange={(_event: React.FormEvent<HTMLDivElement>, option: IDropdownOption) =>
onVectorEmbeddingPolicyChange(index, option, "distanceFunction")
}
></Dropdown>
</Stack>
<Stack>
<Label disabled={disabled} styles={labelStyles}>
Dimensions
</Label>
<TextField
disabled={disabled}
id={`vector-policy-dimension-${index + 1}`}
required={true}
styles={textFieldStyles}
value={String(vectorEmbeddingPolicy.dimensions || 0)}
onChange={(event: React.ChangeEvent<HTMLInputElement>) =>
onVectorEmbeddingDimensionsChange(index, event)
}
errorMessage={vectorEmbeddingPolicy.dimensionsError}
/>
</Stack>
{displayIndexes && (
<Stack>
<Label disabled={disabled} styles={labelStyles}>
Index type
</Label>
<Dropdown
disabled={disabled}
required={true}
styles={dropdownStyles}
options={getIndexTypeOptions()}
selectedKey={vectorEmbeddingPolicy.indexType}
onChange={(_event: React.FormEvent<HTMLDivElement>, option: IDropdownOption) =>
onVectorEmbeddingIndexTypeChange(index, option)
}
></Dropdown>
<Stack style={{ marginLeft: "10px" }}>
<Label
disabled={
disabled ||
(vectorEmbeddingPolicy.indexType !== "quantizedFlat" &&
vectorEmbeddingPolicy.indexType !== "diskANN")
}
styles={labelStyles}
>
Quantization byte size
</Label>
<TextField
disabled={
disabled ||
(vectorEmbeddingPolicy.indexType !== "quantizedFlat" &&
vectorEmbeddingPolicy.indexType !== "diskANN")
}
id={`vector-policy-quantizationByteSize-${index + 1}`}
styles={textFieldStyles}
value={String(vectorEmbeddingPolicy.quantizationByteSize || "")}
onChange={(event: React.ChangeEvent<HTMLInputElement>) =>
onQuantizationByteSizeChange(index, event)
}
/>
</Stack>
<Stack style={{ marginLeft: "10px" }}>
<Label disabled={disabled || vectorEmbeddingPolicy.indexType !== "diskANN"} styles={labelStyles}>
Indexing search list size
</Label>
<TextField
disabled={disabled || vectorEmbeddingPolicy.indexType !== "diskANN"}
id={`vector-policy-indexingSearchListSize-${index + 1}`}
styles={textFieldStyles}
value={String(vectorEmbeddingPolicy.indexingSearchListSize || "")}
onChange={(event: React.ChangeEvent<HTMLInputElement>) =>
onIndexingSearchListSizeChange(index, event)
}
/>
</Stack>
{/*TODO: uncomment after Ignite */}
{/* DiskANNShardKey was removed for Ignite due to backend problems. Leaving this here as it will be reinstated immediately after Ignite
<Stack
style={{ marginLeft: "10px" }}
>
<Label
disabled={disabled || vectorEmbeddingPolicy.indexType !== "diskANN"}
styles={labelStyles}
>DiskANN shard key</Label>
<TextField
disabled={disabled || vectorEmbeddingPolicy.indexType !== "diskANN"}
id={`vector-policy-diskANNShardKey-${index + 1}`}
styles={textFieldStyles}
value={String(vectorEmbeddingPolicy.diskANNShardKey || "")}
onChange={(event: React.ChangeEvent<HTMLInputElement>) =>
onDiskANNShardKeyChange(index, event)
}
/>
</Stack>
*/}
</Stack>
)}
</Stack>
</Stack>
</CollapsibleSectionComponent>
))}
<DefaultButton
disabled={disabled}
id={`add-vector-policy`}
styles={{ root: { maxWidth: 170, fontSize: 12 } }}
onClick={onAdd}
>
Add vector embedding
</DefaultButton>
</Stack>
);
};
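Note: the change handlers above delegate to onQuantizationByteSizeError and onIndexingSearchListSizeError, which are defined earlier in this component and are not part of this hunk. A minimal, self-contained sketch of validators with the same contract (return an error string, or "" when the value is acceptable); the specific bounds below are illustrative assumptions, not taken from this diff:

// Hypothetical stand-alone validators mirroring the handler contract used above.
export const onQuantizationByteSizeErrorSketch = (value: number): string => {
  // Assumed bounds for illustration only; the real limits live in the component.
  if (value < 1 || value > 512) {
    return "Quantization byte size must be between 1 and 512";
  }
  return "";
};

export const onIndexingSearchListSizeErrorSketch = (value: number): string => {
  // Assumed bounds for illustration only.
  if (value < 25 || value > 500) {
    return "Indexing search list size must be between 25 and 500";
  }
  return "";
};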

View File

@@ -10,7 +10,7 @@ import { getCopilotEnabled, isCopilotFeatureRegistered } from "Explorer/QueryCop
import { IGalleryItem } from "Juno/JunoClient";
import { scheduleRefreshDatabaseResourceToken } from "Platform/Fabric/FabricUtil";
import { LocalStorageUtility, StorageKey } from "Shared/StorageUtility";
import { acquireTokenWithMsal, getMsalInstance } from "Utils/AuthorizationUtils";
import { acquireMsalTokenForAccount } from "Utils/AuthorizationUtils";
import { allowedNotebookServerUrls, validateEndpoint } from "Utils/EndpointUtils";
import { update } from "Utils/arm/generatedClients/cosmos/databaseAccounts";
import { useQueryCopilot } from "hooks/useQueryCopilot";
@@ -35,7 +35,7 @@ import { PhoenixClient } from "../Phoenix/PhoenixClient";
import * as ExplorerSettings from "../Shared/ExplorerSettings";
import { Action, ActionModifiers } from "../Shared/Telemetry/TelemetryConstants";
import * as TelemetryProcessor from "../Shared/Telemetry/TelemetryProcessor";
import { isAccountNewerThanThresholdInMs, updateUserContext, userContext } from "../UserContext";
import { updateUserContext, userContext } from "../UserContext";
import { getCollectionName, getUploadName } from "../Utils/APITypeUtils";
import { stringToBlob } from "../Utils/BlobUtils";
import { isCapabilityEnabled } from "../Utils/CapabilityUtils";
@@ -259,25 +259,8 @@ export default class Explorer {
public async openLoginForEntraIDPopUp(): Promise<void> {
if (userContext.databaseAccount.properties?.documentEndpoint) {
const hrefEndpoint = new URL(userContext.databaseAccount.properties.documentEndpoint).href.replace(
/\/$/,
"/.default",
);
const msalInstance = await getMsalInstance();
try {
const response = await msalInstance.loginPopup({
redirectUri: configContext.msalRedirectURI,
scopes: [],
});
localStorage.setItem("cachedTenantId", response.tenantId);
const cachedAccount = msalInstance.getAllAccounts()?.[0];
msalInstance.setActiveAccount(cachedAccount);
const aadToken = await acquireTokenWithMsal(msalInstance, {
forceRefresh: true,
scopes: [hrefEndpoint],
authority: `${configContext.AAD_ENDPOINT}${localStorage.getItem("cachedTenantId")}`,
});
const aadToken = await acquireMsalTokenForAccount(userContext.databaseAccount, false);
updateUserContext({ aadToken: aadToken });
useDataPlaneRbac.setState({ aadTokenUpdated: true });
} catch (error) {
@@ -295,37 +278,6 @@ export default class Explorer {
}
}
public openNPSSurveyDialog(): void {
if (!Platform.Portal || !["Postgres", "SQL", "Mongo"].includes(userContext.apiType)) {
return;
}
const ONE_DAY_IN_MS = 86400000;
const SEVEN_DAYS_IN_MS = 604800000;
// Try Cosmos DB subscription - survey shown to 100% of users at day 1 in Data Explorer.
if (userContext.isTryCosmosDBSubscription) {
if (isAccountNewerThanThresholdInMs(userContext.databaseAccount?.systemData?.createdAt || "", ONE_DAY_IN_MS)) {
Logger.logInfo(
`Sending message to Portal to check if NPS Survey can be displayed in Try Cosmos DB ${userContext.apiType}`,
"Explorer/openNPSSurveyDialog",
);
sendMessage({ type: MessageTypes.DisplayNPSSurvey });
}
} else {
// Show survey when an existing account is older than 7 days
if (
!isAccountNewerThanThresholdInMs(userContext.databaseAccount?.systemData?.createdAt || "", SEVEN_DAYS_IN_MS)
) {
Logger.logInfo(
`Sending message to Portal to check if NPS Survey can be displayed for existing ${userContext.apiType} account older than 7 days`,
"Explorer/openNPSSurveyDialog",
);
sendMessage({ type: MessageTypes.DisplayNPSSurvey });
}
}
}
public async openCESCVAFeedbackBlade(): Promise<void> {
sendMessage({ type: MessageTypes.OpenCESCVAFeedbackBlade });
Logger.logInfo(
@@ -954,25 +906,28 @@ export default class Explorer {
}
public async openNotebookTerminal(kind: ViewModels.TerminalKind): Promise<void> {
if (useNotebook.getState().isPhoenixFeatures) {
await this.allocateContainer(PoolIdType.DefaultPoolId);
const notebookServerInfo = useNotebook.getState().notebookServerInfo;
if (notebookServerInfo && notebookServerInfo.notebookServerEndpoint !== undefined) {
this.connectToNotebookTerminal(kind);
} else {
useDialog
.getState()
.showOkModalDialog(
"Failed to connect",
"Failed to connect to temporary workspace. This could happen because of network issues. Please refresh the page and try again.",
);
}
if (userContext.features.enableCloudShell || !useNotebook.getState().isPhoenixFeatures) {
this.connectToTerminal(kind);
return;
}
await this.allocateContainer(PoolIdType.DefaultPoolId);
const notebookServerInfo = useNotebook.getState().notebookServerInfo;
if (notebookServerInfo?.notebookServerEndpoint) {
this.connectToTerminal(kind);
} else {
this.connectToNotebookTerminal(kind);
useDialog
.getState()
.showOkModalDialog(
"Failed to connect",
"Failed to connect to temporary workspace. This could happen because of network issues. Please refresh the page and try again."
);
}
}
private connectToNotebookTerminal(kind: ViewModels.TerminalKind): void {
private connectToTerminal(kind: ViewModels.TerminalKind): void {
let title: string;
switch (kind) {
@@ -1151,7 +1106,7 @@ export default class Explorer {
if (userContext.apiType !== "Postgres" && userContext.apiType !== "VCoreMongo") {
userContext.authType === AuthType.ResourceToken
? this.refreshDatabaseForResourceToken()
: this.refreshAllDatabases();
: await this.refreshAllDatabases(); // await: we rely on the databases to be loaded before restoring the tabs further in the flow
}
await useNotebook.getState().refreshNotebooksEnabledStateForAccount();
@@ -1175,7 +1130,7 @@ export default class Explorer {
await this.initNotebooks(userContext.databaseAccount);
}
await this.refreshSampleData();
this.refreshSampleData();
}
public async configureCopilot(): Promise<void> {
@@ -1200,26 +1155,27 @@ export default class Explorer {
.setCopilotSampleDBEnabled(copilotEnabled && copilotUserDBEnabled && copilotSampleDBEnabled);
}
public async refreshSampleData(): Promise<void> {
try {
if (!userContext.sampleDataConnectionInfo) {
return;
}
const collection: DataModels.Collection = await readSampleCollection();
if (!collection) {
return;
}
const databaseId = userContext.sampleDataConnectionInfo?.databaseId;
if (!databaseId) {
return;
}
const sampleDataResourceTokenCollection = new ResourceTokenCollection(this, databaseId, collection, true);
useDatabases.setState({ sampleDataResourceTokenCollection });
} catch (error) {
Logger.logError(getErrorMessage(error), "Explorer");
public refreshSampleData(): void {
if (!userContext.sampleDataConnectionInfo) {
return;
}
const databaseId = userContext.sampleDataConnectionInfo?.databaseId;
if (!databaseId) {
return;
}
readSampleCollection()
.then((collection: DataModels.Collection) => {
if (!collection) {
return;
}
const sampleDataResourceTokenCollection = new ResourceTokenCollection(this, databaseId, collection, true);
useDatabases.setState({ sampleDataResourceTokenCollection });
})
.catch((error) => {
Logger.logError(getErrorMessage(error), "Explorer/refreshSampleData");
});
}
}
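The refreshSampleData change above swaps a blocking await for a fire-and-forget promise chain, so container initialization no longer waits on the sample collection. A minimal sketch of the same pattern in isolation; the names are placeholders, not the Explorer API:

// Fire-and-forget refresh: kick off the async work, log failures, never block the caller.
function refreshInBackground(load: () => Promise<string | undefined>, log: (msg: string) => void): void {
  load()
    .then((result) => {
      if (!result) {
        return; // nothing to cache
      }
      // ...store the result in application state here...
    })
    .catch((error) => log(`refresh failed: ${error}`));
}

// Usage: the caller returns immediately; errors surface only through the logger.
refreshInBackground(async () => "sample-collection", console.error);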

View File

@@ -125,13 +125,13 @@ export function createContextCommandBarButtons(
const buttons: CommandButtonComponentProps[] = [];
if (!selectedNodeState.isDatabaseNodeOrNoneSelected() && userContext.apiType === "Mongo") {
const label = useNotebook.getState().isShellEnabled ? "Open Mongo Shell" : "New Shell";
const label = (useNotebook.getState().isShellEnabled || userContext.features.enableCloudShell) ? "Open Mongo Shell" : "New Shell";
const newMongoShellBtn: CommandButtonComponentProps = {
iconSrc: HostedTerminalIcon,
iconAlt: label,
onCommandClick: () => {
const selectedCollection: ViewModels.Collection = selectedNodeState.findSelectedCollection();
if (useNotebook.getState().isShellEnabled) {
if (useNotebook.getState().isShellEnabled || userContext.features.enableCloudShell) {
container.openNotebookTerminal(ViewModels.TerminalKind.Mongo);
} else {
selectedCollection && selectedCollection.onNewMongoShellClick();
@@ -145,7 +145,7 @@ export function createContextCommandBarButtons(
}
if (
useNotebook.getState().isShellEnabled &&
(useNotebook.getState().isShellEnabled || userContext.features.enableCloudShell) &&
!selectedNodeState.isDatabaseNodeOrNoneSelected() &&
userContext.apiType === "Cassandra"
) {
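The same isShellEnabled-or-enableCloudShell condition now appears in several places in this file. A hypothetical helper that would centralize it (not part of this change); it is written as a pure function so it stays easy to unit test:

// Hypothetical helper for the repeated check above.
const isTerminalShellAvailable = (isShellEnabled: boolean, enableCloudShell: boolean): boolean =>
  isShellEnabled || enableCloudShell;

// Call sites would then read:
//   if (isTerminalShellAvailable(useNotebook.getState().isShellEnabled, userContext.features.enableCloudShell)) { ... }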

View File

@@ -1,6 +1,5 @@
import { isPublicInternetAccessAllowed } from "Common/DatabaseAccountUtility";
import { PhoenixClient } from "Phoenix/PhoenixClient";
import { useNewPortalBackendEndpoint } from "Utils/EndpointUtils";
import { cloneDeep } from "lodash";
import create, { UseStore } from "zustand";
import { AuthType } from "../../AuthType";
@@ -128,9 +127,7 @@ export const useNotebook: UseStore<NotebookState> = create((set, get) => ({
userContext.apiType === "Postgres" || userContext.apiType === "VCoreMongo"
? databaseAccount?.location
: databaseAccount?.properties?.writeLocations?.[0]?.locationName.toLowerCase();
const disallowedLocationsUri: string = useNewPortalBackendEndpoint(Constants.BackendApi.DisallowedLocations)
? `${configContext.PORTAL_BACKEND_ENDPOINT}/api/disallowedlocations`
: `${configContext.BACKEND_ENDPOINT}/api/disallowedLocations`;
const disallowedLocationsUri: string = `${configContext.PORTAL_BACKEND_ENDPOINT}/api/disallowedlocations`;
const authorizationHeader = getAuthorizationHeader();
try {
const response = await fetch(disallowedLocationsUri, {
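The hunk above hard-codes the portal backend endpoint for the disallowed-locations lookup. A small sketch of how a caller might consume such a response to decide whether a region is usable; the response shape is an assumption, since this diff only shows the request being issued:

// Assumed response shape: a list of region names that cannot host temporary workspaces.
interface DisallowedLocationsResponse {
  disallowedLocations?: string[];
}

// Returns true when the account's write region is not on the disallowed list.
function isRegionAllowed(region: string | undefined, response: DisallowedLocationsResponse): boolean {
  if (!region) {
    return false;
  }
  const disallowed = response.disallowedLocations ?? [];
  return !disallowed.some((loc) => loc.toLowerCase() === region.toLowerCase());
}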

View File

@@ -1,4 +1,5 @@
// TODO convert this file to an action registry in order to have actions and their handlers be more tightly coupled.
import { configContext, Platform } from "ConfigContext";
import { useDatabases } from "Explorer/useDatabases";
import React from "react";
import { ActionContracts } from "../../Contracts/ExplorerContracts";
@@ -56,6 +57,19 @@ function openCollectionTab(
continue;
}
if (
configContext.platform === Platform.Fabric &&
!(
// whitelist the tab kinds that are allowed to be opened in Fabric
(
action.tabKind === ActionContracts.TabKind.SQLDocuments ||
action.tabKind === ActionContracts.TabKind.SQLQuery
)
)
) {
continue;
}
//expand database first if not expanded to load the collections
if (!database.isDatabaseExpanded?.()) {
database.expandDatabase?.();
@@ -121,10 +135,28 @@ function openCollectionTab(
action.tabKind === ActionContracts.TabKind.SQLQuery ||
action.tabKind === ActionContracts.TabKind[ActionContracts.TabKind.SQLQuery]
) {
const openQueryTabAction = action as ActionContracts.OpenQueryTab;
collection.onNewQueryClick(
collection,
undefined,
generateQueryText(action as ActionContracts.OpenQueryTab, collection.partitionKeyProperties),
generateQueryText(openQueryTabAction, collection.partitionKeyProperties),
openQueryTabAction.splitterDirection,
openQueryTabAction.queryViewSizePercent,
);
break;
}
if (
action.tabKind === ActionContracts.TabKind.MongoQuery ||
action.tabKind === ActionContracts.TabKind[ActionContracts.TabKind.MongoQuery]
) {
const openQueryTabAction = action as ActionContracts.OpenQueryTab;
collection.onNewMongoQueryClick(
collection,
undefined,
generateQueryText(openQueryTabAction, collection.partitionKeyProperties),
openQueryTabAction.splitterDirection,
openQueryTabAction.queryViewSizePercent,
);
break;
}
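The Fabric guard above whitelists only the SQL documents and SQL query tab kinds inline. A hypothetical extraction of that check into a lookup, assuming the ActionContracts import already present in this file; the Set contents mirror the condition in this hunk and allow nothing new:

// Tab kinds that Data Explorer may open when hosted in Fabric, mirroring the inline condition above.
const fabricAllowedTabKinds = new Set<ActionContracts.TabKind>([
  ActionContracts.TabKind.SQLDocuments,
  ActionContracts.TabKind.SQLQuery,
]);

const isTabAllowedInFabric = (tabKind: ActionContracts.TabKind): boolean => fabricAllowedTabKinds.has(tabKind);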

View File

@@ -17,11 +17,15 @@ import {
} from "@fluentui/react";
import * as Constants from "Common/Constants";
import { createCollection } from "Common/dataAccess/createCollection";
import { getNewDatabaseSharedThroughputDefault } from "Common/DatabaseUtility";
import { getErrorMessage, getErrorStack } from "Common/ErrorHandlingUtils";
import { configContext, Platform } from "ConfigContext";
import * as DataModels from "Contracts/DataModels";
import { SubscriptionType } from "Contracts/SubscriptionType";
import { AddVectorEmbeddingPolicyForm } from "Explorer/Panes/VectorSearchPanel/AddVectorEmbeddingPolicyForm";
import {
FullTextPoliciesComponent,
getFullTextLanguageOptions,
} from "Explorer/Controls/FullTextSeach/FullTextPoliciesComponent";
import { VectorEmbeddingPoliciesComponent } from "Explorer/Controls/VectorSearch/VectorEmbeddingPoliciesComponent";
import { useSidePanel } from "hooks/useSidePanel";
import { useTeachingBubble } from "hooks/useTeachingBubble";
import React from "react";
@@ -30,7 +34,12 @@ import { Action } from "Shared/Telemetry/TelemetryConstants";
import * as TelemetryProcessor from "Shared/Telemetry/TelemetryProcessor";
import { userContext } from "UserContext";
import { getCollectionName } from "Utils/APITypeUtils";
import { isCapabilityEnabled, isServerlessAccount, isVectorSearchEnabled } from "Utils/CapabilityUtils";
import {
isCapabilityEnabled,
isFullTextSearchEnabled,
isServerlessAccount,
isVectorSearchEnabled,
} from "Utils/CapabilityUtils";
import { getUpsellMessage } from "Utils/PricingUtils";
import { CollapsibleSectionComponent } from "../Controls/CollapsiblePanel/CollapsibleSectionComponent";
import { ThroughputInput } from "../Controls/ThroughputInput/ThroughputInput";
@@ -109,6 +118,9 @@ export interface AddCollectionPanelState {
vectorIndexingPolicy: DataModels.VectorIndex[];
vectorEmbeddingPolicy: DataModels.VectorEmbedding[];
vectorPolicyValidated: boolean;
fullTextPolicy: DataModels.FullTextPolicy;
fullTextIndexes: DataModels.FullTextIndex[];
fullTextPolicyValidated: boolean;
}
export class AddCollectionPanel extends React.Component<AddCollectionPanelProps, AddCollectionPanelState> {
@@ -125,7 +137,7 @@ export class AddCollectionPanel extends React.Component<AddCollectionPanelProps,
createNewDatabase:
userContext.apiType !== "Tables" && configContext.platform !== Platform.Fabric && !this.props.databaseId,
newDatabaseId: props.isQuickstart ? this.getSampleDBName() : "",
isSharedThroughputChecked: this.getSharedThroughputDefault(),
isSharedThroughputChecked: getNewDatabaseSharedThroughputDefault(),
selectedDatabaseId:
userContext.apiType === "Tables" ? CollectionCreation.TablesAPIDefaultDatabase : this.props.databaseId,
collectionId: props.isQuickstart ? `Sample${getCollectionName()}` : "",
@@ -147,6 +159,9 @@ export class AddCollectionPanel extends React.Component<AddCollectionPanelProps,
vectorEmbeddingPolicy: [],
vectorIndexingPolicy: [],
vectorPolicyValidated: true,
fullTextPolicy: { defaultLanguage: getFullTextLanguageOptions()[0].key as never, fullTextPaths: [] },
fullTextIndexes: [],
fullTextPolicyValidated: true,
};
}
@@ -804,22 +819,9 @@ export class AddCollectionPanel extends React.Component<AddCollectionPanelProps,
{this.shouldShowAnalyticalStoreOptions() && (
<Stack className="panelGroupSpacing">
<Stack horizontal>
<Text className="panelTextBold" variant="small">
Analytical store
</Text>
<TooltipHost
directionalHint={DirectionalHint.bottomLeftEdge}
content={this.getAnalyticalStorageTooltipContent()}
>
<Icon
iconName="Info"
className="panelInfoIcon"
tabIndex={0}
ariaLabel="Enable analytical store capability to perform near real-time analytics on your operational data, without impacting the performance of transactional workloads."
/>
</TooltipHost>
</Stack>
<Text className="panelTextBold" variant="small">
{this.getAnalyticalStorageContent()}
</Text>
<Stack horizontal verticalAlign="center">
<div role="radiogroup">
@@ -863,6 +865,7 @@ export class AddCollectionPanel extends React.Component<AddCollectionPanelProps,
<Link
href="https://aka.ms/cosmosdb-synapselink"
target="_blank"
aria-label={Constants.ariaLabelForLearnMoreLink.AzureSynapseLink}
className="capacitycalculator-link"
>
Learn more
@@ -890,9 +893,9 @@ export class AddCollectionPanel extends React.Component<AddCollectionPanelProps,
>
<Stack id="collapsibleVectorPolicySectionContent" styles={{ root: { position: "relative" } }}>
<Stack styles={{ root: { paddingLeft: 40 } }}>
<AddVectorEmbeddingPolicyForm
vectorEmbedding={this.state.vectorEmbeddingPolicy}
vectorIndex={this.state.vectorIndexingPolicy}
<VectorEmbeddingPoliciesComponent
vectorEmbeddings={this.state.vectorEmbeddingPolicy}
vectorIndexes={this.state.vectorIndexingPolicy}
onVectorEmbeddingChange={(
vectorEmbeddingPolicy: DataModels.VectorEmbedding[],
vectorIndexingPolicy: DataModels.VectorIndex[],
@@ -906,6 +909,34 @@ export class AddCollectionPanel extends React.Component<AddCollectionPanelProps,
</CollapsibleSectionComponent>
</Stack>
)}
{this.shouldShowFullTextSearchParameters() && (
<Stack>
<CollapsibleSectionComponent
title="Container Full Text Search Policy"
isExpandedByDefault={false}
onExpand={() => {
this.scrollToSection("collapsibleFullTextPolicySectionContent");
}}
//TODO: uncomment when learn more text becomes available
// tooltipContent={this.getContainerFullTextPolicyTooltipContent()}
>
<Stack id="collapsibleFullTextPolicySectionContent" styles={{ root: { position: "relative" } }}>
<Stack styles={{ root: { paddingLeft: 40 } }}>
<FullTextPoliciesComponent
fullTextPolicy={this.state.fullTextPolicy}
onFullTextPathChange={(
fullTextPolicy: DataModels.FullTextPolicy,
fullTextIndexes: DataModels.FullTextIndex[],
fullTextPolicyValidated: boolean,
) => {
this.setState({ fullTextPolicy, fullTextIndexes, fullTextPolicyValidated });
}}
/>
</Stack>
</Stack>
</CollapsibleSectionComponent>
</Stack>
)}
{userContext.apiType !== "Tables" && (
<CollapsibleSectionComponent
title="Advanced"
@@ -1138,10 +1169,6 @@ export class AddCollectionPanel extends React.Component<AddCollectionPanelProps,
return userContext.databaseAccount?.properties?.enableFreeTier;
}
private getSharedThroughputDefault(): boolean {
return userContext.subscriptionType !== SubscriptionType.EA && !isServerlessAccount();
}
private getFreeTierIndexingText(): string {
return this.state.enableIndexing
? "All properties in your documents will be indexed by default for flexible and efficient queries."
@@ -1191,12 +1218,16 @@ export class AddCollectionPanel extends React.Component<AddCollectionPanelProps,
return "";
}
private getAnalyticalStorageTooltipContent(): JSX.Element {
private getAnalyticalStorageContent(): JSX.Element {
return (
<Text variant="small">
Enable analytical store capability to perform near real-time analytics on your operational data, without
impacting the performance of transactional workloads.{" "}
<Link target="_blank" href="https://aka.ms/analytical-store-overview">
<Link
aria-label={Constants.ariaLabelForLearnMoreLink.AnalyticalStore}
target="_blank"
href="https://aka.ms/analytical-store-overview"
>
Learn more
</Link>
</Text>
@@ -1215,6 +1246,19 @@ export class AddCollectionPanel extends React.Component<AddCollectionPanelProps,
);
}
//TODO: uncomment when learn more text becomes available
// private getContainerFullTextPolicyTooltipContent(): JSX.Element {
// return (
// <Text variant="small">
// Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore
// magna aliqua.{" "}
// <Link target="_blank" href="https://aka.ms/CosmosFullTextSearch">
// Learn more
// </Link>
// </Text>
// );
// }
private shouldShowCollectionThroughputInput(): boolean {
if (isServerlessAccount()) {
return false;
@@ -1278,6 +1322,10 @@ export class AddCollectionPanel extends React.Component<AddCollectionPanelProps,
return isVectorSearchEnabled() && (isServerlessAccount() || this.shouldShowCollectionThroughputInput());
}
private shouldShowFullTextSearchParameters() {
return isFullTextSearchEnabled() && (isServerlessAccount() || this.shouldShowCollectionThroughputInput());
}
private parseUniqueKeys(): DataModels.UniqueKeyPolicy {
if (this.state.uniqueKeys?.length === 0) {
return undefined;
@@ -1334,9 +1382,16 @@ export class AddCollectionPanel extends React.Component<AddCollectionPanelProps,
return false;
}
if (this.shouldShowVectorSearchParameters() && !this.state.vectorPolicyValidated) {
this.setState({ errorMessage: "Please fix errors in container vector policy" });
return false;
if (this.shouldShowVectorSearchParameters()) {
if (!this.state.vectorPolicyValidated) {
this.setState({ errorMessage: "Please fix errors in container vector policy" });
return false;
}
if (!this.state.fullTextPolicyValidated) {
this.setState({ errorMessage: "Please fix errors in container full text search policy" });
return false;
}
}
return true;
@@ -1427,6 +1482,10 @@ export class AddCollectionPanel extends React.Component<AddCollectionPanelProps,
};
}
if (this.shouldShowFullTextSearchParameters()) {
indexingPolicy.fullTextIndexes = this.state.fullTextIndexes;
}
const telemetryData = {
database: {
id: databaseId,
@@ -1486,6 +1545,7 @@ export class AddCollectionPanel extends React.Component<AddCollectionPanelProps,
uniqueKeyPolicy,
createMongoWildcardIndex: this.state.createMongoWildCardIndex,
vectorEmbeddingPolicy,
fullTextPolicy: this.state.fullTextPolicy,
};
this.setState({ isExecuting: true });
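The new full text search state above relies on DataModels.FullTextPolicy and DataModels.FullTextIndex. Based only on how they are used in this panel (a default language plus fullTextPaths, and indexes attached to the indexing policy), a rough sketch of the shapes involved; the real contracts live in Contracts/DataModels and may differ:

// Approximate shapes inferred from usage in this panel; treat as illustrative only.
interface FullTextPath {
  path: string; // e.g. "/description"
  language: string; // e.g. "en-US"
}

interface FullTextPolicySketch {
  defaultLanguage: string;
  fullTextPaths: FullTextPath[];
}

interface FullTextIndexSketch {
  path: string;
}

// Default state mirrors the constructor above: first language option, no paths or indexes yet.
const defaultFullTextPolicy: FullTextPolicySketch = {
  defaultLanguage: "en-US",
  fullTextPaths: [],
};
const defaultFullTextIndexes: FullTextIndexSketch[] = [];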

View File

@@ -1,4 +1,5 @@
import { Checkbox, Stack, Text, TextField } from "@fluentui/react";
import { getNewDatabaseSharedThroughputDefault } from "Common/DatabaseUtility";
import React, { FunctionComponent, useEffect, useState } from "react";
import * as Constants from "../../../Common/Constants";
import { getErrorMessage, getErrorStack } from "../../../Common/ErrorHandlingUtils";
@@ -48,7 +49,7 @@ export const AddDatabasePanel: FunctionComponent<AddDatabasePaneProps> = ({
const databaseLevelThroughputTooltipText = `Provisioned throughput at the ${databaseLabel} level will be shared across all ${collectionsLabel} within the ${databaseLabel}.`;
const [databaseCreateNewShared, setDatabaseCreateNewShared] = useState<boolean>(
subscriptionType !== SubscriptionType.EA && !isServerlessAccount(),
getNewDatabaseSharedThroughputDefault(),
);
const [formErrors, setFormErrors] = useState<string>("");
const [isExecuting, setIsExecuting] = useState<boolean>(false);
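Both panels now call getNewDatabaseSharedThroughputDefault instead of computing the default inline. Judging from the private getSharedThroughputDefault removed from AddCollectionPanel earlier in this compare, the shared helper presumably resembles the sketch below; the actual implementation in Common/DatabaseUtility may consider more signals:

// Sketch assumed to mirror the removed private method in AddCollectionPanel.
export function getNewDatabaseSharedThroughputDefaultSketch(isEASubscription: boolean, isServerless: boolean): boolean {
  return !isEASubscription && !isServerless;
}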

View File

@@ -65,7 +65,7 @@ exports[`AddDatabasePane Pane should render Default properly 1`] = `
horizontal={true}
>
<StyledCheckboxBase
checked={true}
checked={false}
label="Provision throughput"
onChange={[Function]}
styles={
@@ -90,14 +90,6 @@ exports[`AddDatabasePane Pane should render Default properly 1`] = `
</InfoTooltip>
</Stack>
</Stack>
<ThroughputInput
isDatabase={true}
isSharded={true}
onCostAcknowledgeChange={[Function]}
setIsAutoscale={[Function]}
setIsThroughputCapExceeded={[Function]}
setThroughputValue={[Function]}
/>
</div>
</RightPaneForm>
`;

View File

@@ -124,7 +124,7 @@ export const DeleteDatabaseConfirmationPanel: FunctionComponent<DeleteDatabaseCo
message:
"Warning! The action you are about to take cannot be undone. Continuing will permanently delete this resource and all of its children resources.",
};
const confirmDatabase = `Confirm by typing the ${getDatabaseName()} id`;
const confirmDatabase = `Confirm by typing the ${getDatabaseName()} id (name)`;
const reasonInfo = `Help us improve Azure Cosmos DB! What is the reason why you are deleting this ${getDatabaseName()}?`;
return (
<RightPaneForm {...props}>
@@ -132,7 +132,7 @@ export const DeleteDatabaseConfirmationPanel: FunctionComponent<DeleteDatabaseCo
<div className="panelMainContent">
<div className="confirmDeleteInput">
<span className="mandatoryStar">* </span>
<Text variant="small">Confirm by typing the {getDatabaseName()} id</Text>
<Text variant="small">{confirmDatabase}</Text>
<TextField
id="confirmDatabaseId"
data-test="Input:confirmDatabaseId"

View File

@@ -1,3 +1,7 @@
import {
AuthError as msalAuthError,
BrowserAuthErrorMessage as msalBrowserAuthErrorMessage,
} from "@azure/msal-browser";
import {
Checkbox,
ChoiceGroup,
@@ -5,8 +9,6 @@ import {
IChoiceGroupOption,
ISpinButtonStyles,
IToggleStyles,
MessageBar,
MessageBarType,
Position,
SpinButton,
Toggle,
@@ -30,6 +32,7 @@ import {
} from "Shared/StorageUtility";
import * as StringUtility from "Shared/StringUtility";
import { updateUserContext, userContext } from "UserContext";
import { acquireMsalTokenForAccount } from "Utils/AuthorizationUtils";
import { logConsoleError, logConsoleInfo } from "Utils/NotificationConsoleUtils";
import * as PriorityBasedExecutionUtils from "Utils/PriorityBasedExecutionUtils";
import { getReadOnlyKeys, listKeys } from "Utils/arm/generatedClients/cosmos/databaseAccounts";
@@ -108,7 +111,6 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
? LocalStorageUtility.getEntryString(StorageKey.DataPlaneRbacEnabled)
: Constants.RBACOptions.setAutomaticRBACOption,
);
const [showDataPlaneRBACWarning, setShowDataPlaneRBACWarning] = useState<boolean>(false);
const [ruThresholdEnabled, setRUThresholdEnabled] = useState<boolean>(isRUThresholdEnabled());
const [ruThreshold, setRUThreshold] = useState<number>(getRUThreshold());
@@ -172,15 +174,26 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
const styles = useStyles();
const explorerVersion = configContext.gitSha;
const isEmulator = configContext.platform === Platform.Emulator;
const shouldShowQueryPageOptions = userContext.apiType === "SQL";
const shouldShowGraphAutoVizOption = userContext.apiType === "Gremlin";
const shouldShowCrossPartitionOption = userContext.apiType !== "Gremlin";
const shouldShowParallelismOption = userContext.apiType !== "Gremlin";
const shouldShowPriorityLevelOption = PriorityBasedExecutionUtils.isFeatureEnabled();
const showRetrySettings =
(userContext.apiType === "SQL" || userContext.apiType === "Tables" || userContext.apiType === "Gremlin") &&
!isEmulator;
const shouldShowGraphAutoVizOption = userContext.apiType === "Gremlin" && !isEmulator;
const shouldShowCrossPartitionOption = userContext.apiType !== "Gremlin" && !isEmulator;
const shouldShowParallelismOption = userContext.apiType !== "Gremlin" && !isEmulator;
const showEnableEntraIdRbac =
userContext.apiType === "SQL" &&
userContext.authType === AuthType.AAD &&
configContext.platform !== Platform.Fabric &&
!isEmulator;
const shouldShowPriorityLevelOption = PriorityBasedExecutionUtils.isFeatureEnabled() && !isEmulator;
const shouldShowCopilotSampleDBOption =
userContext.apiType === "SQL" &&
useQueryCopilot.getState().copilotEnabled &&
useDatabases.getState().sampleDataResourceTokenCollection;
useDatabases.getState().sampleDataResourceTokenCollection &&
!isEmulator;
const handlerOnSubmit = async () => {
setIsExecuting(true);
@@ -191,6 +204,17 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
LocalStorageUtility.setEntryNumber(StorageKey.CustomItemPerPage, customItemPerPage);
if (
enableDataPlaneRBACOption !== LocalStorageUtility.getEntryString(StorageKey.DataPlaneRbacEnabled) ||
retryAttempts !== LocalStorageUtility.getEntryNumber(StorageKey.RetryAttempts) ||
retryInterval !== LocalStorageUtility.getEntryNumber(StorageKey.RetryInterval) ||
MaxWaitTimeInSeconds !== LocalStorageUtility.getEntryNumber(StorageKey.MaxWaitTimeInSeconds)
) {
updateUserContext({
refreshCosmosClient: true,
});
}
if (configContext.platform !== Platform.Fabric) {
LocalStorageUtility.setEntryString(StorageKey.DataPlaneRbacEnabled, enableDataPlaneRBACOption);
if (
@@ -200,13 +224,29 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
) {
updateUserContext({
dataPlaneRbacEnabled: true,
hasDataPlaneRbacSettingChanged: true,
});
useDataPlaneRbac.setState({ dataPlaneRbacEnabled: true });
try {
const aadToken = await acquireMsalTokenForAccount(userContext.databaseAccount, true);
updateUserContext({ aadToken: aadToken });
useDataPlaneRbac.setState({ aadTokenUpdated: true });
} catch (authError) {
if (
authError instanceof msalAuthError &&
authError.errorCode === msalBrowserAuthErrorMessage.popUpWindowError.code
) {
logConsoleError(
`We were unable to establish authorization for this account because pop-ups are disabled in the browser.\nPlease enable pop-ups for this site and click on the "Login for Entra ID" button`,
);
} else {
logConsoleError(
`Failed to acquire authorization token automatically. Please click on the "Login for Entra ID" button to enable Entra ID RBAC operations`,
);
}
}
} else {
updateUserContext({
dataPlaneRbacEnabled: false,
hasDataPlaneRbacSettingChanged: true,
});
const { databaseAccount: account, subscriptionId, resourceGroup } = userContext;
if (!userContext.features.enableAadDataPlane && !userContext.masterKey) {
@@ -347,13 +387,6 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
option: IChoiceGroupOption,
): void => {
setEnableDataPlaneRBACOption(option.key);
const shouldShowWarning =
(option.key === Constants.RBACOptions.setTrueRBACOption ||
(option.key === Constants.RBACOptions.setAutomaticRBACOption &&
userContext.databaseAccount.properties.disableLocalAuth === true)) &&
!useDataPlaneRbac.getState().aadTokenUpdated;
setShowDataPlaneRBACWarning(shouldShowWarning);
};
const handleOnRUThresholdToggleChange = (ev: React.MouseEvent<HTMLElement>, checked?: boolean): void => {
@@ -469,7 +502,7 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
return (
<RightPaneForm {...genericPaneProps}>
<div className={`paneMainContent ${styles.container}`}>
<Accordion className={styles.firstItem}>
<Accordion className={`customAccordion ${styles.firstItem}`}>
{shouldShowQueryPageOptions && (
<AccordionItem value="1">
<AccordionHeader>
@@ -519,51 +552,37 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
</AccordionPanel>
</AccordionItem>
)}
{userContext.apiType === "SQL" &&
userContext.authType === AuthType.AAD &&
configContext.platform !== Platform.Fabric && (
<AccordionItem value="2">
<AccordionHeader>
<div className={styles.header}>Enable Entra ID RBAC</div>
</AccordionHeader>
<AccordionPanel>
<div className={styles.settingsSectionContainer}>
{showDataPlaneRBACWarning && configContext.platform === Platform.Portal && (
<MessageBar
messageBarType={MessageBarType.warning}
isMultiline={true}
onDismiss={() => setShowDataPlaneRBACWarning(false)}
dismissButtonAriaLabel="Close"
>
Please click on &quot;Login for Entra ID RBAC&quot; button prior to performing Entra ID RBAC
operations
</MessageBar>
)}
<div className={styles.settingsSectionDescription}>
Choose Automatic to enable Entra ID RBAC automatically. True/False to force enable/disable Entra
ID RBAC.
<a
href="https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-setup-rbac#use-data-explorer"
target="_blank"
rel="noopener noreferrer"
>
{" "}
Learn more{" "}
</a>
</div>
<ChoiceGroup
ariaLabelledBy="enableDataPlaneRBACOptions"
options={dataPlaneRBACOptionsList}
styles={choiceButtonStyles}
selectedKey={enableDataPlaneRBACOption}
onChange={handleOnDataPlaneRBACOptionChange}
/>
{showEnableEntraIdRbac && (
<AccordionItem value="2">
<AccordionHeader>
<div className={styles.header}>Enable Entra ID RBAC</div>
</AccordionHeader>
<AccordionPanel>
<div className={styles.settingsSectionContainer}>
<div className={styles.settingsSectionDescription}>
Choose Automatic to enable Entra ID RBAC automatically. True/False to force enable/disable Entra ID
RBAC.
<a
href="https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-setup-rbac#use-data-explorer"
target="_blank"
rel="noopener noreferrer"
>
{" "}
Learn more{" "}
</a>
</div>
</AccordionPanel>
</AccordionItem>
)}
{userContext.apiType === "SQL" && (
<ChoiceGroup
ariaLabelledBy="enableDataPlaneRBACOptions"
options={dataPlaneRBACOptionsList}
styles={choiceButtonStyles}
selectedKey={enableDataPlaneRBACOption}
onChange={handleOnDataPlaneRBACOptionChange}
/>
</div>
</AccordionPanel>
</AccordionItem>
)}
{userContext.apiType === "SQL" && !isEmulator && (
<>
<AccordionItem value="3">
<AccordionHeader>
@@ -661,102 +680,103 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
</AccordionItem>
</>
)}
<AccordionItem value="6">
<AccordionHeader>
<div className={styles.header}>Retry Settings</div>
</AccordionHeader>
<AccordionPanel>
<div className={styles.settingsSectionContainer}>
<div className={styles.settingsSectionDescription}>
Retry policy associated with throttled requests during CosmosDB queries.
{showRetrySettings && (
<AccordionItem value="6">
<AccordionHeader>
<div className={styles.header}>Retry Settings</div>
</AccordionHeader>
<AccordionPanel>
<div className={styles.settingsSectionContainer}>
<div className={styles.settingsSectionDescription}>
Retry policy associated with throttled requests during CosmosDB queries.
</div>
<div>
<span className={styles.subHeader}>Max retry attempts</span>
<InfoTooltip className={styles.headerIcon}>
Max number of retries to be performed for a request. Default value 9.
</InfoTooltip>
</div>
<SpinButton
labelPosition={Position.top}
min={1}
step={1}
value={"" + retryAttempts}
onChange={handleOnQueryRetryAttemptsSpinButtonChange}
incrementButtonAriaLabel="Increase value by 1"
decrementButtonAriaLabel="Decrease value by 1"
onIncrement={(newValue) => setRetryAttempts(parseInt(newValue) + 1 || retryAttempts)}
onDecrement={(newValue) => setRetryAttempts(parseInt(newValue) - 1 || retryAttempts)}
onValidate={(newValue) => setRetryAttempts(parseInt(newValue) || retryAttempts)}
styles={spinButtonStyles}
/>
<div>
<span className={styles.subHeader}>Fixed retry interval (ms)</span>
<InfoTooltip className={styles.headerIcon}>
Fixed retry interval in milliseconds to wait between each retry ignoring the retryAfter returned
as part of the response. Default value is 0 milliseconds.
</InfoTooltip>
</div>
<SpinButton
labelPosition={Position.top}
min={1000}
step={1000}
value={"" + retryInterval}
onChange={handleOnRetryIntervalSpinButtonChange}
incrementButtonAriaLabel="Increase value by 1000"
decrementButtonAriaLabel="Decrease value by 1000"
onIncrement={(newValue) => setRetryInterval(parseInt(newValue) + 1000 || retryInterval)}
onDecrement={(newValue) => setRetryInterval(parseInt(newValue) - 1000 || retryInterval)}
onValidate={(newValue) => setRetryInterval(parseInt(newValue) || retryInterval)}
styles={spinButtonStyles}
/>
<div>
<span className={styles.subHeader}>Max wait time (s)</span>
<InfoTooltip className={styles.headerIcon}>
Max wait time in seconds to wait for a request while the retries are happening. Default value 30
seconds.
</InfoTooltip>
</div>
<SpinButton
labelPosition={Position.top}
min={1}
step={1}
value={"" + MaxWaitTimeInSeconds}
onChange={handleOnMaxWaitTimeSpinButtonChange}
incrementButtonAriaLabel="Increase value by 1"
decrementButtonAriaLabel="Decrease value by 1"
onIncrement={(newValue) => setMaxWaitTimeInSeconds(parseInt(newValue) + 1 || MaxWaitTimeInSeconds)}
onDecrement={(newValue) => setMaxWaitTimeInSeconds(parseInt(newValue) - 1 || MaxWaitTimeInSeconds)}
onValidate={(newValue) => setMaxWaitTimeInSeconds(parseInt(newValue) || MaxWaitTimeInSeconds)}
styles={spinButtonStyles}
/>
</div>
<div>
<span className={styles.subHeader}>Max retry attempts</span>
<InfoTooltip className={styles.headerIcon}>
Max number of retries to be performed for a request. Default value 9.
</InfoTooltip>
</AccordionPanel>
</AccordionItem>
)}
{!isEmulator && (
<AccordionItem value="7">
<AccordionHeader>
<div className={styles.header}>Enable container pagination</div>
</AccordionHeader>
<AccordionPanel>
<div className={styles.settingsSectionContainer}>
<div className={styles.settingsSectionDescription}>
Load 50 containers at a time. Currently, containers are not pulled in alphanumeric order.
</div>
<Checkbox
styles={{
label: { padding: 0 },
}}
className="padding"
ariaLabel="Enable container pagination"
checked={containerPaginationEnabled}
onChange={() => setContainerPaginationEnabled(!containerPaginationEnabled)}
label="Enable container pagination"
/>
</div>
<SpinButton
labelPosition={Position.top}
min={1}
step={1}
value={"" + retryAttempts}
onChange={handleOnQueryRetryAttemptsSpinButtonChange}
incrementButtonAriaLabel="Increase value by 1"
decrementButtonAriaLabel="Decrease value by 1"
onIncrement={(newValue) => setRetryAttempts(parseInt(newValue) + 1 || retryAttempts)}
onDecrement={(newValue) => setRetryAttempts(parseInt(newValue) - 1 || retryAttempts)}
onValidate={(newValue) => setRetryAttempts(parseInt(newValue) || retryAttempts)}
styles={spinButtonStyles}
/>
<div>
<span className={styles.subHeader}>Fixed retry interval (ms)</span>
<InfoTooltip className={styles.headerIcon}>
Fixed retry interval in milliseconds to wait between each retry ignoring the retryAfter returned as
part of the response. Default value is 0 milliseconds.
</InfoTooltip>
</div>
<SpinButton
labelPosition={Position.top}
min={1000}
step={1000}
value={"" + retryInterval}
onChange={handleOnRetryIntervalSpinButtonChange}
incrementButtonAriaLabel="Increase value by 1000"
decrementButtonAriaLabel="Decrease value by 1000"
onIncrement={(newValue) => setRetryInterval(parseInt(newValue) + 1000 || retryInterval)}
onDecrement={(newValue) => setRetryInterval(parseInt(newValue) - 1000 || retryInterval)}
onValidate={(newValue) => setRetryInterval(parseInt(newValue) || retryInterval)}
styles={spinButtonStyles}
/>
<div>
<span className={styles.subHeader}>Max wait time (s)</span>
<InfoTooltip className={styles.headerIcon}>
Max wait time in seconds to wait for a request while the retries are happening. Default value 30
seconds.
</InfoTooltip>
</div>
<SpinButton
labelPosition={Position.top}
min={1}
step={1}
value={"" + MaxWaitTimeInSeconds}
onChange={handleOnMaxWaitTimeSpinButtonChange}
incrementButtonAriaLabel="Increase value by 1"
decrementButtonAriaLabel="Decrease value by 1"
onIncrement={(newValue) => setMaxWaitTimeInSeconds(parseInt(newValue) + 1 || MaxWaitTimeInSeconds)}
onDecrement={(newValue) => setMaxWaitTimeInSeconds(parseInt(newValue) - 1 || MaxWaitTimeInSeconds)}
onValidate={(newValue) => setMaxWaitTimeInSeconds(parseInt(newValue) || MaxWaitTimeInSeconds)}
styles={spinButtonStyles}
/>
</div>
</AccordionPanel>
</AccordionItem>
<AccordionItem value="7">
<AccordionHeader>
<div className={styles.header}>Enable container pagination</div>
</AccordionHeader>
<AccordionPanel>
<div className={styles.settingsSectionContainer}>
<div className={styles.settingsSectionDescription}>
Load 50 containers at a time. Currently, containers are not pulled in alphanumeric order.
</div>
<Checkbox
styles={{
label: { padding: 0 },
}}
className="padding"
ariaLabel="Enable container pagination"
checked={containerPaginationEnabled}
onChange={() => setContainerPaginationEnabled(!containerPaginationEnabled)}
label="Enable container pagination"
/>
</div>
</AccordionPanel>
</AccordionItem>
</AccordionPanel>
</AccordionItem>
)}
{shouldShowCrossPartitionOption && (
<AccordionItem value="8">
<AccordionHeader>
@@ -782,7 +802,6 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
</AccordionPanel>
</AccordionItem>
)}
{shouldShowParallelismOption && (
<AccordionItem value="9">
<AccordionHeader>
@@ -816,7 +835,6 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
</AccordionPanel>
</AccordionItem>
)}
{shouldShowPriorityLevelOption && (
<AccordionItem value="10">
<AccordionHeader>
@@ -840,7 +858,6 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
</AccordionPanel>
</AccordionItem>
)}
{shouldShowGraphAutoVizOption && (
<AccordionItem value="11">
<AccordionHeader>
@@ -862,7 +879,6 @@ export const SettingsPane: FunctionComponent<{ explorer: Explorer }> = ({
</AccordionPanel>
</AccordionItem>
)}
{shouldShowCopilotSampleDBOption && (
<AccordionItem value="12">
<AccordionHeader>
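The SettingsPane and Explorer changes in this compare both call acquireMsalTokenForAccount with a database account and a boolean. Inferring the signature only from those call sites (the real declaration is in Utils/AuthorizationUtils), a sketch of wrapping such a call with the pop-up-blocked handling shown above; the error code string and the meaning of the boolean flag are assumptions:

// Assumed signature, inferred from call sites in this compare; verify against Utils/AuthorizationUtils.
type AcquireToken = (account: unknown, flag: boolean) => Promise<string>;

async function tryAcquireAadToken(
  acquire: AcquireToken,
  account: unknown,
  onPopupBlocked: () => void,
  onOtherError: (error: unknown) => void,
): Promise<string | undefined> {
  try {
    // The flag's semantics are not shown in this diff; true is used here only for illustration.
    return await acquire(account, true);
  } catch (error) {
    // The component above distinguishes msal's pop-up-window error from everything else;
    // "popup_window_error" is assumed to match msalBrowserAuthErrorMessage.popUpWindowError.code.
    const popupBlocked =
      typeof error === "object" &&
      error !== null &&
      (error as { errorCode?: string }).errorCode === "popup_window_error";
    if (popupBlocked) {
      onPopupBlocked();
    } else {
      onOtherError(error);
    }
    return undefined;
  }
}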

View File

@@ -11,7 +11,7 @@ exports[`Settings Pane should render Default properly 1`] = `
className="paneMainContent ___133e6fg_0000000 f22iagw f1vx9l62 f1l02sjl"
>
<Accordion
className="___1uf6361_0000000 fz7g6wx"
className="customAccordion ___1uf6361_0000000 fz7g6wx"
>
<AccordionItem
value="1"
@@ -572,7 +572,7 @@ exports[`Settings Pane should render Gremlin properly 1`] = `
className="paneMainContent ___133e6fg_0000000 f22iagw f1vx9l62 f1l02sjl"
>
<Accordion
className="___1uf6361_0000000 fz7g6wx"
className="customAccordion ___1uf6361_0000000 fz7g6wx"
>
<AccordionItem
value="6"

View File

@@ -1,300 +0,0 @@
import {
DefaultButton,
Dropdown,
IDropdownOption,
IStyleFunctionOrObject,
ITextFieldStyleProps,
ITextFieldStyles,
IconButton,
Label,
Stack,
TextField,
} from "@fluentui/react";
import { VectorEmbedding, VectorIndex } from "Contracts/DataModels";
import { CollapsibleSectionComponent } from "Explorer/Controls/CollapsiblePanel/CollapsibleSectionComponent";
import {
getDataTypeOptions,
getDistanceFunctionOptions,
getIndexTypeOptions,
} from "Explorer/Panes/VectorSearchPanel/VectorSearchUtils";
import React, { FunctionComponent, useState } from "react";
export interface IAddVectorEmbeddingPolicyFormProps {
vectorEmbedding: VectorEmbedding[];
vectorIndex: VectorIndex[];
onVectorEmbeddingChange: (
vectorEmbeddings: VectorEmbedding[],
vectorIndexingPolicies: VectorIndex[],
validationPassed: boolean,
) => void;
}
export interface VectorEmbeddingPolicyData {
path: string;
dataType: VectorEmbedding["dataType"];
distanceFunction: VectorEmbedding["distanceFunction"];
dimensions: number;
indexType: VectorIndex["type"] | "none";
pathError: string;
dimensionsError: string;
}
type VectorEmbeddingPolicyProperty = "dataType" | "distanceFunction" | "indexType";
const textFieldStyles: IStyleFunctionOrObject<ITextFieldStyleProps, ITextFieldStyles> = {
fieldGroup: {
height: 27,
},
field: {
fontSize: 12,
padding: "0 8px",
},
};
const dropdownStyles = {
title: {
height: 27,
lineHeight: "24px",
fontSize: 12,
},
dropdown: {
height: 27,
lineHeight: "24px",
},
dropdownItem: {
fontSize: 12,
},
};
export const AddVectorEmbeddingPolicyForm: FunctionComponent<IAddVectorEmbeddingPolicyFormProps> = ({
vectorEmbedding,
vectorIndex,
onVectorEmbeddingChange,
}): JSX.Element => {
const onVectorEmbeddingPathError = (path: string, index?: number): string => {
let error = "";
if (!path) {
error = "Vector embedding path should not be empty";
}
if (
index >= 0 &&
vectorEmbeddingPolicyData?.find(
(vectorEmbedding: VectorEmbeddingPolicyData, dataIndex: number) =>
dataIndex !== index && vectorEmbedding.path === path,
)
) {
error = "Vector embedding path is already defined";
}
return error;
};
const onVectorEmbeddingDimensionError = (dimension: number, indexType: VectorIndex["type"] | "none"): string => {
let error = "";
if (dimension <= 0 || dimension > 4096) {
error = "Vector embedding dimension must be greater than 0 and less than or equal 4096";
}
if (indexType === "flat" && dimension > 505) {
error = "Maximum allowed dimension for flat index is 505";
}
return error;
};
const initializeData = (vectorEmbedding: VectorEmbedding[], vectorIndex: VectorIndex[]) => {
const mergedData: VectorEmbeddingPolicyData[] = [];
vectorEmbedding.forEach((embedding) => {
const matchingIndex = vectorIndex.find((index) => index.path === embedding.path);
mergedData.push({
...embedding,
indexType: matchingIndex?.type || "none",
pathError: onVectorEmbeddingPathError(embedding.path),
dimensionsError: onVectorEmbeddingDimensionError(embedding.dimensions, matchingIndex?.type || "none"),
});
});
return mergedData;
};
const [vectorEmbeddingPolicyData, setVectorEmbeddingPolicyData] = useState<VectorEmbeddingPolicyData[]>(
initializeData(vectorEmbedding, vectorIndex),
);
React.useEffect(() => {
propagateData();
}, [vectorEmbeddingPolicyData]);
const propagateData = () => {
const vectorEmbeddings: VectorEmbedding[] = vectorEmbeddingPolicyData.map((policy: VectorEmbeddingPolicyData) => ({
dataType: policy.dataType,
dimensions: policy.dimensions,
distanceFunction: policy.distanceFunction,
path: policy.path,
}));
const vectorIndexingPolicies: VectorIndex[] = vectorEmbeddingPolicyData
.filter((policy: VectorEmbeddingPolicyData) => policy.indexType !== "none")
.map(
(policy) =>
({
path: policy.path,
type: policy.indexType,
}) as VectorIndex,
);
const validationPassed = vectorEmbeddingPolicyData.every(
(policy: VectorEmbeddingPolicyData) => policy.pathError === "" && policy.dimensionsError === "",
);
onVectorEmbeddingChange(vectorEmbeddings, vectorIndexingPolicies, validationPassed);
};
const onVectorEmbeddingPathChange = (index: number, event: React.ChangeEvent<HTMLInputElement>) => {
const value = event.target.value.trim();
const vectorEmbeddings = [...vectorEmbeddingPolicyData];
if (!vectorEmbeddings[index]?.path && !value.startsWith("/")) {
vectorEmbeddings[index].path = "/" + value;
} else {
vectorEmbeddings[index].path = value;
}
const error = onVectorEmbeddingPathError(value, index);
vectorEmbeddings[index].pathError = error;
setVectorEmbeddingPolicyData(vectorEmbeddings);
};
const onVectorEmbeddingDimensionsChange = (index: number, event: React.ChangeEvent<HTMLInputElement>) => {
const value = parseInt(event.target.value.trim()) || 0;
const vectorEmbeddings = [...vectorEmbeddingPolicyData];
const vectorEmbedding = vectorEmbeddings[index];
vectorEmbeddings[index].dimensions = value;
const error = onVectorEmbeddingDimensionError(value, vectorEmbedding.indexType);
vectorEmbeddings[index].dimensionsError = error;
setVectorEmbeddingPolicyData(vectorEmbeddings);
};
const onVectorEmbeddingIndexTypeChange = (index: number, option: IDropdownOption): void => {
const vectorEmbeddings = [...vectorEmbeddingPolicyData];
const vectorEmbedding = vectorEmbeddings[index];
vectorEmbeddings[index].indexType = option.key as never;
const error = onVectorEmbeddingDimensionError(vectorEmbedding.dimensions, vectorEmbedding.indexType);
vectorEmbeddings[index].dimensionsError = error;
setVectorEmbeddingPolicyData(vectorEmbeddings);
};
const onVectorEmbeddingPolicyChange = (
index: number,
option: IDropdownOption,
property: VectorEmbeddingPolicyProperty,
): void => {
const vectorEmbeddings = [...vectorEmbeddingPolicyData];
vectorEmbeddings[index][property] = option.key as never;
setVectorEmbeddingPolicyData(vectorEmbeddings);
};
const onAdd = () => {
setVectorEmbeddingPolicyData([
...vectorEmbeddingPolicyData,
{
path: "",
dataType: "float32",
distanceFunction: "euclidean",
dimensions: 0,
indexType: "none",
pathError: onVectorEmbeddingPathError(""),
dimensionsError: onVectorEmbeddingDimensionError(0, "none"),
},
]);
};
const onDelete = (index: number) => {
const vectorEmbeddings = vectorEmbeddingPolicyData.filter((_uniqueKey, j) => index !== j);
setVectorEmbeddingPolicyData(vectorEmbeddings);
};
return (
<Stack tokens={{ childrenGap: 4 }}>
{vectorEmbeddingPolicyData.length > 0 &&
vectorEmbeddingPolicyData.map((vectorEmbeddingPolicy: VectorEmbeddingPolicyData, index: number) => (
<CollapsibleSectionComponent key={index} isExpandedByDefault={true} title={`Vector embedding ${index + 1}`}>
<Stack horizontal tokens={{ childrenGap: 4 }}>
<Stack
styles={{
root: {
margin: "0 0 6px 20px !important",
paddingLeft: 20,
width: "80%",
borderLeft: "1px solid",
},
}}
>
<Stack>
<Label styles={{ root: { fontSize: 12 } }}>Path</Label>
<TextField
id={`vector-policy-path-${index + 1}`}
required={true}
placeholder="/vector1"
styles={textFieldStyles}
onChange={(event: React.ChangeEvent<HTMLInputElement>) => onVectorEmbeddingPathChange(index, event)}
value={vectorEmbeddingPolicy.path || ""}
errorMessage={vectorEmbeddingPolicy.pathError}
/>
</Stack>
<Stack>
<Label styles={{ root: { fontSize: 12 } }}>Data type</Label>
<Dropdown
required={true}
styles={dropdownStyles}
options={getDataTypeOptions()}
selectedKey={vectorEmbeddingPolicy.dataType}
onChange={(_event: React.FormEvent<HTMLDivElement>, option: IDropdownOption) =>
onVectorEmbeddingPolicyChange(index, option, "dataType")
}
></Dropdown>
</Stack>
<Stack>
<Label styles={{ root: { fontSize: 12 } }}>Distance function</Label>
<Dropdown
required={true}
styles={dropdownStyles}
options={getDistanceFunctionOptions()}
selectedKey={vectorEmbeddingPolicy.distanceFunction}
onChange={(_event: React.FormEvent<HTMLDivElement>, option: IDropdownOption) =>
onVectorEmbeddingPolicyChange(index, option, "distanceFunction")
}
></Dropdown>
</Stack>
<Stack>
<Label styles={{ root: { fontSize: 12 } }}>Dimensions</Label>
<TextField
id={`vector-policy-dimension-${index + 1}`}
required={true}
styles={textFieldStyles}
value={String(vectorEmbeddingPolicy.dimensions || 0)}
onChange={(event: React.ChangeEvent<HTMLInputElement>) =>
onVectorEmbeddingDimensionsChange(index, event)
}
errorMessage={vectorEmbeddingPolicy.dimensionsError}
/>
</Stack>
<Stack>
<Label styles={{ root: { fontSize: 12 } }}>Index type</Label>
<Dropdown
required={true}
styles={dropdownStyles}
options={getIndexTypeOptions()}
selectedKey={vectorEmbeddingPolicy.indexType}
onChange={(_event: React.FormEvent<HTMLDivElement>, option: IDropdownOption) =>
onVectorEmbeddingIndexTypeChange(index, option)
}
></Dropdown>
</Stack>
</Stack>
<IconButton
id={`delete-vector-policy-${index + 1}`}
iconProps={{ iconName: "Delete" }}
style={{ height: 27, margin: "auto" }}
onClick={() => onDelete(index)}
/>
</Stack>
</CollapsibleSectionComponent>
))}
<DefaultButton id={`add-vector-policy`} styles={{ root: { maxWidth: 170, fontSize: 12 } }} onClick={onAdd}>
Add vector embedding
</DefaultButton>
</Stack>
);
};

View File

@@ -106,7 +106,7 @@ exports[`AddCollectionPanel should render Default properly 1`] = `
horizontal={true}
>
<StyledCheckboxBase
checked={true}
checked={false}
label="Share throughput across containers"
onChange={[Function]}
styles={
@@ -137,14 +137,6 @@ exports[`AddCollectionPanel should render Default properly 1`] = `
/>
</StyledTooltipHostBase>
</Stack>
<ThroughputInput
isDatabase={true}
isSharded={true}
onCostAcknowledgeChange={[Function]}
setIsAutoscale={[Function]}
setIsThroughputCapExceeded={[Function]}
setThroughputValue={[Function]}
/>
</Stack>
<Separator
className="panelSeparator"
@@ -263,6 +255,14 @@ exports[`AddCollectionPanel should render Default properly 1`] = `
</CustomizedDefaultButton>
</Stack>
</Stack>
<ThroughputInput
isDatabase={false}
isSharded={true}
onCostAcknowledgeChange={[Function]}
setIsAutoscale={[Function]}
setIsThroughputCapExceeded={[Function]}
setThroughputValue={[Function]}
/>
<Stack>
<Stack
horizontal={true}
@@ -309,40 +309,24 @@ exports[`AddCollectionPanel should render Default properly 1`] = `
<Stack
className="panelGroupSpacing"
>
<Stack
horizontal={true}
<Text
className="panelTextBold"
variant="small"
>
<Text
className="panelTextBold"
variant="small"
>
Analytical store
Enable analytical store capability to perform near real-time analytics on your operational data, without impacting the performance of transactional workloads.
<StyledLinkBase
aria-label="Learn more about analytical store."
href="https://aka.ms/analytical-store-overview"
target="_blank"
>
Learn more
</StyledLinkBase>
</Text>
<StyledTooltipHostBase
content={
<Text
variant="small"
>
Enable analytical store capability to perform near real-time analytics on your operational data, without impacting the performance of transactional workloads.
<StyledLinkBase
href="https://aka.ms/analytical-store-overview"
target="_blank"
>
Learn more
</StyledLinkBase>
</Text>
}
directionalHint={4}
>
<Icon
ariaLabel="Enable analytical store capability to perform near real-time analytics on your operational data, without impacting the performance of transactional workloads."
className="panelInfoIcon"
iconName="Info"
tabIndex={0}
/>
</StyledTooltipHostBase>
</Stack>
</Text>
<Stack
horizontal={true}
verticalAlign="center"
@@ -400,6 +384,7 @@ exports[`AddCollectionPanel should render Default properly 1`] = `
. Enable Synapse Link for this Cosmos DB account.
<StyledLinkBase
aria-label="Learn more about Azure Synapse Link."
className="capacitycalculator-link"
href="https://aka.ms/cosmosdb-synapselink"
target="_blank"

View File

@@ -361,13 +361,11 @@ exports[`Delete Database Confirmation Pane Should call delete database 1`] = `
<span
className="css-113"
>
Confirm by typing the
Database
id
Confirm by typing the Database id (name)
</span>
</Text>
<StyledTextFieldBase
ariaLabel="Confirm by typing the Database id"
ariaLabel="Confirm by typing the Database id (name)"
autoFocus={true}
data-test="Input:confirmDatabaseId"
id="confirmDatabaseId"
@@ -382,7 +380,7 @@ exports[`Delete Database Confirmation Pane Should call delete database 1`] = `
}
>
<TextFieldBase
ariaLabel="Confirm by typing the Database id"
ariaLabel="Confirm by typing the Database id (name)"
autoFocus={true}
data-test="Input:confirmDatabaseId"
deferredValidationTime={200}
@@ -677,7 +675,7 @@ exports[`Delete Database Confirmation Pane Should call delete database 1`] = `
>
<input
aria-invalid={false}
aria-label="Confirm by typing the Database id"
aria-label="Confirm by typing the Database id (name)"
autoFocus={true}
className="ms-TextField-field field-117"
data-test="Input:confirmDatabaseId"

View File

@@ -79,9 +79,13 @@ export const QueryCopilotFeedbackModal = ({
readOnly
/>
<Text style={{ fontSize: 12, marginBottom: 14 }}>
By pressing submit, your feedback will be used to improve Microsoft products and services. Please see the{" "}
Microsoft will process the feedback you submit pursuant to your organization's instructions in order to
improve your and your organization's experience with this product. If you have any questions about the use
of feedback data, please contact your tenant administrator. Processing of feedback data is governed by the
Microsoft Products and Services Data Protection Addendum between your organization and Microsoft, and the
feedback you submit is considered Personal Data under that addendum. Please see the{" "}
{
<Link href="https://privacy.microsoft.com/privacystatement" target="_blank">
<Link href="https://go.microsoft.com/fwlink/?LinkId=521839" target="_blank">
Privacy statement
</Link>
}{" "}

View File

@@ -99,10 +99,10 @@ exports[`Query Copilot Feedback Modal snapshot test shoud render and match snaps
}
}
>
By pressing submit, your feedback will be used to improve Microsoft products and services. Please see the
Microsoft will process the feedback you submit pursuant to your organization's instructions in order to improve your and your organization's experience with this product. If you have any questions about the use of feedback data, please contact your tenant administrator. Processing of feedback data is governed by the Microsoft Products and Services Data Protection Addendum between your organization and Microsoft, and the feedback you submit is considered Personal Data under that addendum. Please see the
<StyledLinkBase
href="https://privacy.microsoft.com/privacystatement"
href="https://go.microsoft.com/fwlink/?LinkId=521839"
target="_blank"
>
Privacy statement
@@ -236,10 +236,10 @@ exports[`Query Copilot Feedback Modal snapshot test should cancel submission 1`]
}
}
>
By pressing submit, your feedback will be used to improve Microsoft products and services. Please see the
Microsoft will process the feedback you submit pursuant to your organization's instructions in order to improve your and your organization's experience with this product. If you have any questions about the use of feedback data, please contact your tenant administrator. Processing of feedback data is governed by the Microsoft Products and Services Data Protection Addendum between your organization and Microsoft, and the feedback you submit is considered Personal Data under that addendum. Please see the
<StyledLinkBase
href="https://privacy.microsoft.com/privacystatement"
href="https://go.microsoft.com/fwlink/?LinkId=521839"
target="_blank"
>
Privacy statement
@@ -373,10 +373,10 @@ exports[`Query Copilot Feedback Modal snapshot test should close on cancel click
}
}
>
By pressing submit, your feedback will be used to improve Microsoft products and services. Please see the
Microsoft will process the feedback you submit pursuant to your organization's instructions in order to improve your and your organization's experience with this product. If you have any questions about the use of feedback data, please contact your tenant administrator. Processing of feedback data is governed by the Microsoft Products and Services Data Protection Addendum between your organization and Microsoft, and the feedback you submit is considered Personal Data under that addendum. Please see the
<StyledLinkBase
href="https://privacy.microsoft.com/privacystatement"
href="https://go.microsoft.com/fwlink/?LinkId=521839"
target="_blank"
>
Privacy statement
@@ -510,10 +510,10 @@ exports[`Query Copilot Feedback Modal snapshot test should get user unput 1`] =
}
}
>
By pressing submit, your feedback will be used to improve Microsoft products and services. Please see the
Microsoft will process the feedback you submit pursuant to your organization's instructions in order to improve your and your organization's experience with this product. If you have any questions about the use of feedback data, please contact your tenant administrator. Processing of feedback data is governed by the Microsoft Products and Services Data Protection Addendum between your organization and Microsoft, and the feedback you submit is considered Personal Data under that addendum. Please see the
<StyledLinkBase
href="https://privacy.microsoft.com/privacystatement"
href="https://go.microsoft.com/fwlink/?LinkId=521839"
target="_blank"
>
Privacy statement
@@ -647,10 +647,10 @@ exports[`Query Copilot Feedback Modal snapshot test should not render dont show
}
}
>
By pressing submit, your feedback will be used to improve Microsoft products and services. Please see the
Microsoft will process the feedback you submit pursuant to your organization's instructions in order to improve your and your organization's experience with this product. If you have any questions about the use of feedback data, please contact your tenant administrator. Processing of feedback data is governed by the Microsoft Products and Services Data Protection Addendum between your organization and Microsoft, and the feedback you submit is considered Personal Data under that addendum. Please see the
<StyledLinkBase
href="https://privacy.microsoft.com/privacystatement"
href="https://go.microsoft.com/fwlink/?LinkId=521839"
target="_blank"
>
Privacy statement
@@ -784,10 +784,10 @@ exports[`Query Copilot Feedback Modal snapshot test should render dont show agai
}
}
>
By pressing submit, your feedback will be used to improve Microsoft products and services. Please see the
Microsoft will process the feedback you submit pursuant to your organization's instructions in order to improve your and your organization's experience with this product. If you have any questions about the use of feedback data, please contact your tenant administrator. Processing of feedback data is governed by the Microsoft Products and Services Data Protection Addendum between your organization and Microsoft, and the feedback you submit is considered Personal Data under that addendum. Please see the
<StyledLinkBase
href="https://privacy.microsoft.com/privacystatement"
href="https://go.microsoft.com/fwlink/?LinkId=521839"
target="_blank"
>
Privacy statement
@@ -936,10 +936,10 @@ exports[`Query Copilot Feedback Modal snapshot test should submit submission 1`]
}
}
>
By pressing submit, your feedback will be used to improve Microsoft products and services. Please see the
Microsoft will process the feedback you submit pursuant to your organization's instructions in order to improve your and your organization's experience with this product. If you have any questions about the use of feedback data, please contact your tenant administrator. Processing of feedback data is governed by the Microsoft Products and Services Data Protection Addendum between your organization and Microsoft, and the feedback you submit is considered Personal Data under that addendum. Please see the
<StyledLinkBase
href="https://privacy.microsoft.com/privacystatement"
href="https://go.microsoft.com/fwlink/?LinkId=521839"
target="_blank"
>
Privacy statement

View File

@@ -75,6 +75,7 @@ export const QueryCopilotPromptbar: React.FC<QueryCopilotPromptProps> = ({
const inputEdited = useRef(false);
const itemRefs = useRef([]);
const searchInputRef = useRef(null);
const copyQueryRef = useRef(null);
const {
openFeedbackModal,
hideFeedbackModalForLikedQueries,
@@ -132,6 +133,7 @@ export const QueryCopilotPromptbar: React.FC<QueryCopilotPromptProps> = ({
document.body.removeChild(queryElement);
setshowCopyPopup(true);
copyQueryRef.current.focus();
setTimeout(() => {
setshowCopyPopup(false);
}, 6000);
@@ -305,7 +307,7 @@ export const QueryCopilotPromptbar: React.FC<QueryCopilotPromptProps> = ({
if (isGeneratingQuery === null) {
return " ";
} else if (isGeneratingQuery) {
return "Content is loading";
return "Thinking";
} else {
return "Content is updated";
}
@@ -400,6 +402,7 @@ export const QueryCopilotPromptbar: React.FC<QueryCopilotPromptProps> = ({
<IconButton
iconProps={{ iconName: "Send" }}
disabled={isGeneratingQuery || !userPrompt.trim()}
allowDisabledFocus={true}
style={{ background: "none" }}
onClick={() => startGenerateQueryProcess()}
aria-label="Send"
@@ -676,6 +679,7 @@ export const QueryCopilotPromptbar: React.FC<QueryCopilotPromptProps> = ({
)}
<CommandBarButton
className="copyQuery"
elementRef={copyQueryRef}
onClick={copyGeneratedCode}
iconProps={{ iconName: "Copy" }}
style={{ fontSize: 12, transition: "background-color 0.3s ease", height: "100%" }}
@@ -706,6 +710,9 @@ export const QueryCopilotPromptbar: React.FC<QueryCopilotPromptProps> = ({
)}
</Stack>
)}
{(showFeedbackBar || isGeneratingQuery) && (
<span role="alert" className="screenReaderOnly" aria-label={getAriaLabel()} />
)}
{isGeneratingQuery && (
<ProgressIndicator
label="Thinking..."

View File

@@ -1,7 +1,6 @@
import { FeedOptions } from "@azure/cosmos";
import {
Areas,
BackendApi,
ConnectionStatusType,
ContainerStatusType,
HttpStatusCodes,
@@ -32,7 +31,6 @@ import { Action } from "Shared/Telemetry/TelemetryConstants";
import { traceFailure, traceStart, traceSuccess } from "Shared/Telemetry/TelemetryProcessor";
import { userContext } from "UserContext";
import { getAuthorizationHeader } from "Utils/AuthorizationUtils";
import { useNewPortalBackendEndpoint } from "Utils/EndpointUtils";
import { queryPagesUntilContentPresent } from "Utils/QueryUtils";
import { QueryCopilotState, useQueryCopilot } from "hooks/useQueryCopilot";
import { useTabs } from "hooks/useTabs";
@@ -82,9 +80,7 @@ export const isCopilotFeatureRegistered = async (subscriptionId: string): Promis
};
export const getCopilotEnabled = async (): Promise<boolean> => {
const backendEndpoint: string = useNewPortalBackendEndpoint(BackendApi.PortalSettings)
? configContext.PORTAL_BACKEND_ENDPOINT
: configContext.BACKEND_ENDPOINT;
const backendEndpoint: string = configContext.PORTAL_BACKEND_ENDPOINT;
const url = `${backendEndpoint}/api/portalsettings/querycopilot`;
const authorizationHeader: AuthorizationTokenHeaderMetadata = getAuthorizationHeader();

View File

@@ -26,7 +26,7 @@ import { getCollectionName, getDatabaseName } from "Utils/APITypeUtils";
import { Allotment, AllotmentHandle } from "allotment";
import { useSidePanel } from "hooks/useSidePanel";
import { debounce } from "lodash";
import React, { useCallback, useEffect, useMemo, useRef, useState } from "react";
import React, { useCallback, useEffect, useLayoutEffect, useMemo, useRef, useState } from "react";
const useSidebarStyles = makeStyles({
sidebar: {
@@ -109,6 +109,7 @@ interface GlobalCommand {
icon: JSX.Element;
onClick: () => void;
keyboardAction?: KeyboardAction;
ref?: React.RefObject<HTMLButtonElement>;
}
const GlobalCommands: React.FC<GlobalCommandsProps> = ({ explorer }) => {
@@ -118,6 +119,7 @@ const GlobalCommands: React.FC<GlobalCommandsProps> = ({ explorer }) => {
// However, that messes with the Menu positioning, so we need to get a reference to the 'div' to pass to the Menu.
// We can't use a ref though, because it would be set after the Menu is rendered, so we use a state value to force a re-render.
const [globalCommandButton, setGlobalCommandButton] = useState<HTMLElement | null>(null);
const primaryFocusableRef = useRef<HTMLButtonElement>(null);
const actions = useMemo<GlobalCommand[]>(() => {
if (
@@ -177,6 +179,16 @@ const GlobalCommands: React.FC<GlobalCommandsProps> = ({ explorer }) => {
);
}, [actions, setKeyboardActions]);
useLayoutEffect(() => {
if (primaryFocusableRef.current) {
const timer = setTimeout(() => {
primaryFocusableRef.current.focus();
}, 0);
return () => clearTimeout(timer);
}
return undefined;
}, []);
if (!primaryAction) {
return null;
}
@@ -184,7 +196,7 @@ const GlobalCommands: React.FC<GlobalCommandsProps> = ({ explorer }) => {
return (
<div className={styles.globalCommandsContainer} data-test="GlobalCommands">
{actions.length === 1 ? (
<Button icon={primaryAction.icon} onClick={onPrimaryActionClick}>
<Button icon={primaryAction.icon} onClick={onPrimaryActionClick} ref={primaryFocusableRef}>
{primaryAction.label}
</Button>
) : (
@@ -194,7 +206,7 @@ const GlobalCommands: React.FC<GlobalCommandsProps> = ({ explorer }) => {
<div ref={setGlobalCommandButton}>
<SplitButton
menuButton={{ ...triggerProps, "aria-label": "More commands" }}
primaryActionButton={{ onClick: onPrimaryActionClick }}
primaryActionButton={{ onClick: onPrimaryActionClick, ref: primaryFocusableRef }}
className={styles.globalCommandsSplitButton}
icon={primaryAction.icon}
>
@@ -282,67 +294,69 @@ export const SidebarContainer: React.FC<SidebarProps> = ({ explorer }) => {
);
return (
<Allotment ref={allotment} onChange={onChange} onDragEnd={onDragEnd} className="resourceTreeAndTabs">
{/* Collections Tree - Start */}
{hasSidebar && (
// When collapsed, we force the pane to 24 pixels wide and make it non-resizable.
<Allotment.Pane minSize={24} preferredSize={250}>
<CosmosFluentProvider className={mergeClasses(styles.sidebar)}>
<div className={styles.sidebarContainer}>
{loading && (
// The Fluent UI progress bar has some issues in reduced-motion environments so we use a simple CSS animation here.
// https://github.com/microsoft/fluentui/issues/29076
<div className={styles.loadingProgressBar} title="Refreshing tree..." />
)}
{expanded ? (
<>
<div className={styles.floatingControlsContainer}>
<div className={styles.floatingControls}>
<button
type="button"
data-test="Sidebar/RefreshButton"
className={styles.floatingControlButton}
disabled={loading}
title="Refresh"
onClick={onRefreshClick}
>
<ArrowSync12Regular />
</button>
<button
type="button"
className={styles.floatingControlButton}
title="Collapse sidebar"
onClick={() => collapse()}
>
<ChevronLeft12Regular />
</button>
<div className="sidebarContainer">
<Allotment ref={allotment} onChange={onChange} onDragEnd={onDragEnd} className="resourceTreeAndTabs">
{/* Collections Tree - Start */}
{hasSidebar && (
// When collapsed, we force the pane to 24 pixels wide and make it non-resizable.
<Allotment.Pane minSize={24} preferredSize={250}>
<CosmosFluentProvider className={mergeClasses(styles.sidebar)}>
<div className={styles.sidebarContainer}>
{loading && (
// The Fluent UI progress bar has some issues in reduced-motion environments so we use a simple CSS animation here.
// https://github.com/microsoft/fluentui/issues/29076
<div className={styles.loadingProgressBar} title="Refreshing tree..." />
)}
{expanded ? (
<>
<div className={styles.floatingControlsContainer}>
<div className={styles.floatingControls}>
<button
type="button"
data-test="Sidebar/RefreshButton"
className={styles.floatingControlButton}
disabled={loading}
title="Refresh"
onClick={onRefreshClick}
>
<ArrowSync12Regular />
</button>
<button
type="button"
className={styles.floatingControlButton}
title="Collapse sidebar"
onClick={() => collapse()}
>
<ChevronLeft12Regular />
</button>
</div>
</div>
</div>
<div
className={styles.expandedContent}
style={!hasGlobalCommands ? { gridTemplateRows: "1fr" } : undefined}
<div
className={styles.expandedContent}
style={!hasGlobalCommands ? { gridTemplateRows: "1fr" } : undefined}
>
{hasGlobalCommands && <GlobalCommands explorer={explorer} />}
<ResourceTree explorer={explorer} />
</div>
</>
) : (
<button
type="button"
className={styles.floatingControlButton}
title="Expand sidebar"
onClick={() => expand()}
>
{hasGlobalCommands && <GlobalCommands explorer={explorer} />}
<ResourceTree explorer={explorer} />
</div>
</>
) : (
<button
type="button"
className={styles.floatingControlButton}
title="Expand sidebar"
onClick={() => expand()}
>
<ChevronRight12Regular />
</button>
)}
</div>
</CosmosFluentProvider>
<ChevronRight12Regular />
</button>
)}
</div>
</CosmosFluentProvider>
</Allotment.Pane>
)}
<Allotment.Pane minSize={200}>
<Tabs explorer={explorer} />
</Allotment.Pane>
)}
<Allotment.Pane minSize={200}>
<Tabs explorer={explorer} />
</Allotment.Pane>
</Allotment>
</Allotment>
</div>
);
};
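The GlobalCommands change above moves keyboard focus onto the primary command button after the first layout pass, using a deferred focus inside useLayoutEffect. A reduced sketch of that pattern, independent of the Fluent UI components used here (the component name and button label are illustrative):

import React, { useLayoutEffect, useRef } from "react";

// Minimal sketch: move keyboard focus to a button once it has mounted.
export const FocusOnMountButton: React.FC = () => {
  const buttonRef = useRef<HTMLButtonElement>(null);

  useLayoutEffect(() => {
    // Defer one tick so focus is applied after the surrounding layout settles.
    const timer = setTimeout(() => buttonRef.current?.focus(), 0);
    return () => clearTimeout(timer);
  }, []);

  return (
    <button type="button" ref={buttonRef}>
      New Container
    </button>
  );
};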

View File

@@ -39,7 +39,7 @@ export const SplashScreenButton: React.FC<SplashScreenButtonProps> = ({
role="button"
>
<div>
<img src={imgSrc} />
<img src={imgSrc} alt={title} aria-hidden="true" />
</div>
<Stack style={{ marginLeft: 16 }}>
<Text style={{ fontSize: 18, fontWeight: 600 }}>{title}</Text>

View File

@@ -3,7 +3,7 @@ import * as ko from "knockout";
import Q from "q";
import { AuthType } from "../../AuthType";
import * as Constants from "../../Common/Constants";
import { CassandraProxyAPIs, CassandraProxyEndpoints } from "../../Common/Constants";
import { CassandraProxyAPIs } from "../../Common/Constants";
import { handleError } from "../../Common/ErrorHandlingUtils";
import * as HeadersUtility from "../../Common/HeadersUtility";
import { createDocument } from "../../Common/dataAccess/createDocument";
@@ -264,9 +264,6 @@ export class CassandraAPIDataClient extends TableDataClient {
shouldNotify?: boolean,
paginationToken?: string,
): Promise<Entities.IListTableEntitiesResult> {
if (!this.useCassandraProxyEndpoint("postQuery")) {
return this.queryDocuments_ToBeDeprecated(collection, query, shouldNotify, paginationToken);
}
const clearMessage =
shouldNotify && NotificationConsoleUtils.logConsoleProgress(`Querying rows for table ${collection.id()}`);
try {
@@ -309,55 +306,6 @@ export class CassandraAPIDataClient extends TableDataClient {
}
}
public async queryDocuments_ToBeDeprecated(
collection: ViewModels.Collection,
query: string,
shouldNotify?: boolean,
paginationToken?: string,
): Promise<Entities.IListTableEntitiesResult> {
const clearMessage =
shouldNotify && NotificationConsoleUtils.logConsoleProgress(`Querying rows for table ${collection.id()}`);
try {
const { authType, databaseAccount } = userContext;
const apiEndpoint: string =
authType === AuthType.EncryptedToken
? Constants.CassandraBackend.guestQueryApi
: Constants.CassandraBackend.queryApi;
const data: any = await $.ajax(`${configContext.BACKEND_ENDPOINT}/${apiEndpoint}`, {
type: "POST",
data: {
accountName: databaseAccount?.name,
cassandraEndpoint: this.trimCassandraEndpoint(databaseAccount?.properties.cassandraEndpoint),
resourceId: databaseAccount?.id,
keyspaceId: collection.databaseId,
tableId: collection.id(),
query,
paginationToken,
},
beforeSend: this.setAuthorizationHeader as any,
cache: false,
});
shouldNotify &&
NotificationConsoleUtils.logConsoleInfo(
`Successfully fetched ${data.result.length} rows for table ${collection.id()}`,
);
return {
Results: data.result,
ContinuationToken: data.paginationToken,
};
} catch (error) {
shouldNotify &&
handleError(
error,
"QueryDocuments_ToBeDeprecated_Cassandra",
`Failed to query rows for table ${collection.id()}`,
);
throw error;
} finally {
clearMessage?.();
}
}
public async deleteDocuments(
collection: ViewModels.Collection,
entitiesToDelete: Entities.ITableEntity[],
@@ -471,10 +419,6 @@ export class CassandraAPIDataClient extends TableDataClient {
}
public getTableKeys(collection: ViewModels.Collection): Q.Promise<CassandraTableKeys> {
if (!this.useCassandraProxyEndpoint("getKeys")) {
return this.getTableKeys_ToBeDeprecated(collection);
}
if (!!collection.cassandraKeys) {
return Q.resolve(collection.cassandraKeys);
}
@@ -515,52 +459,7 @@ export class CassandraAPIDataClient extends TableDataClient {
return deferred.promise;
}
public getTableKeys_ToBeDeprecated(collection: ViewModels.Collection): Q.Promise<CassandraTableKeys> {
if (!!collection.cassandraKeys) {
return Q.resolve(collection.cassandraKeys);
}
const clearInProgressMessage = logConsoleProgress(`Fetching keys for table ${collection.id()}`);
const { authType, databaseAccount } = userContext;
const apiEndpoint: string =
authType === AuthType.EncryptedToken
? Constants.CassandraBackend.guestKeysApi
: Constants.CassandraBackend.keysApi;
let endpoint = `${configContext.BACKEND_ENDPOINT}/${apiEndpoint}`;
const deferred = Q.defer<CassandraTableKeys>();
$.ajax(endpoint, {
type: "POST",
data: {
accountName: databaseAccount?.name,
cassandraEndpoint: this.trimCassandraEndpoint(databaseAccount?.properties.cassandraEndpoint),
resourceId: databaseAccount?.id,
keyspaceId: collection.databaseId,
tableId: collection.id(),
},
beforeSend: this.setAuthorizationHeader as any,
cache: false,
})
.then(
(data: CassandraTableKeys) => {
collection.cassandraKeys = data;
logConsoleInfo(`Successfully fetched keys for table ${collection.id()}`);
deferred.resolve(data);
},
(error: any) => {
const errorText = error.responseJSON?.message ?? JSON.stringify(error);
handleError(errorText, "FetchKeysCassandra", `Error fetching keys for table ${collection.id()}`);
deferred.reject(errorText);
},
)
.done(clearInProgressMessage);
return deferred.promise;
}
public getTableSchema(collection: ViewModels.Collection): Q.Promise<CassandraTableKey[]> {
if (!this.useCassandraProxyEndpoint("getSchema")) {
return this.getTableSchema_ToBeDeprecated(collection);
}
if (!!collection.cassandraSchema) {
return Q.resolve(collection.cassandraSchema);
}
@@ -602,52 +501,7 @@ export class CassandraAPIDataClient extends TableDataClient {
return deferred.promise;
}
public getTableSchema_ToBeDeprecated(collection: ViewModels.Collection): Q.Promise<CassandraTableKey[]> {
if (!!collection.cassandraSchema) {
return Q.resolve(collection.cassandraSchema);
}
const clearInProgressMessage = logConsoleProgress(`Fetching schema for table ${collection.id()}`);
const { databaseAccount, authType } = userContext;
const apiEndpoint: string =
authType === AuthType.EncryptedToken
? Constants.CassandraBackend.guestSchemaApi
: Constants.CassandraBackend.schemaApi;
let endpoint = `${configContext.BACKEND_ENDPOINT}/${apiEndpoint}`;
const deferred = Q.defer<CassandraTableKey[]>();
$.ajax(endpoint, {
type: "POST",
data: {
accountName: databaseAccount?.name,
cassandraEndpoint: this.trimCassandraEndpoint(databaseAccount?.properties.cassandraEndpoint),
resourceId: databaseAccount?.id,
keyspaceId: collection.databaseId,
tableId: collection.id(),
},
beforeSend: this.setAuthorizationHeader as any,
cache: false,
})
.then(
(data: any) => {
collection.cassandraSchema = data.columns;
logConsoleInfo(`Successfully fetched schema for table ${collection.id()}`);
deferred.resolve(data.columns);
},
(error: any) => {
const errorText = error.responseJSON?.message ?? JSON.stringify(error);
handleError(errorText, "FetchSchemaCassandra", `Error fetching schema for table ${collection.id()}`);
deferred.reject(errorText);
},
)
.done(clearInProgressMessage);
return deferred.promise;
}
private createOrDeleteQuery(cassandraEndpoint: string, resourceId: string, query: string): Q.Promise<any> {
if (!this.useCassandraProxyEndpoint("createOrDelete")) {
return this.createOrDeleteQuery_ToBeDeprecated(cassandraEndpoint, resourceId, query);
}
const deferred = Q.defer();
const { authType, databaseAccount } = userContext;
const apiEndpoint: string =
@@ -677,38 +531,6 @@ export class CassandraAPIDataClient extends TableDataClient {
return deferred.promise;
}
private createOrDeleteQuery_ToBeDeprecated(
cassandraEndpoint: string,
resourceId: string,
query: string,
): Q.Promise<any> {
const deferred = Q.defer();
const { authType, databaseAccount } = userContext;
const apiEndpoint: string =
authType === AuthType.EncryptedToken
? Constants.CassandraBackend.guestCreateOrDeleteApi
: Constants.CassandraBackend.createOrDeleteApi;
$.ajax(`${configContext.BACKEND_ENDPOINT}/${apiEndpoint}`, {
type: "POST",
data: {
accountName: databaseAccount?.name,
cassandraEndpoint: this.trimCassandraEndpoint(cassandraEndpoint),
resourceId: resourceId,
query: query,
},
beforeSend: this.setAuthorizationHeader as any,
cache: false,
}).then(
(data: any) => {
deferred.resolve();
},
(reason) => {
deferred.reject(reason);
},
);
return deferred.promise;
}
private trimCassandraEndpoint(cassandraEndpoint: string): string {
if (!cassandraEndpoint) {
return cassandraEndpoint;
@@ -747,19 +569,4 @@ export class CassandraAPIDataClient extends TableDataClient {
private getCassandraPartitionKeyProperty(collection: ViewModels.Collection): string {
return collection.cassandraKeys.partitionKeys[0].property;
}
private useCassandraProxyEndpoint(api: string): boolean {
const activeCassandraProxyEndpoints: string[] = [
CassandraProxyEndpoints.Development,
CassandraProxyEndpoints.Mpac,
CassandraProxyEndpoints.Prod,
CassandraProxyEndpoints.Fairfax,
CassandraProxyEndpoints.Mooncake,
];
return (
configContext.NEW_CASSANDRA_APIS?.includes(api) &&
activeCassandraProxyEndpoints.includes(configContext.CASSANDRA_PROXY_ENDPOINT)
);
}
}

View File

@@ -0,0 +1,126 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
*/
import { IDisposable, ITerminalAddon, Terminal } from 'xterm';
interface IAttachOptions {
bidirectional?: boolean;
}
export class AttachAddon implements ITerminalAddon {
private _socket: WebSocket;
private _bidirectional: boolean;
private _disposables: IDisposable[] = [];
private _socketData: string;
constructor(socket: WebSocket, options?: IAttachOptions) {
this._socket = socket;
// always set binary type to arraybuffer, we do not handle blobs
this._socket.binaryType = 'arraybuffer';
this._bidirectional = !(options && options.bidirectional === false);
this._socketData = '';
}
public activate(terminal: Terminal): void {
this._disposables.push(
addSocketListener(this._socket, 'message', ev => {
let data: ArrayBuffer | string = ev.data;
const startStatusJson = 'ie_us';
const endStatusJson = 'ie_ue';
if (typeof data === 'object') {
const enc = new TextDecoder("utf-8");
data = enc.decode(ev.data as any);
}
// for example of json object look in TerminalHelper in the socket.onMessage
if (data.includes(startStatusJson) && data.includes(endStatusJson)) {
// process as one line
const statusData = data.split(startStatusJson)[1].split(endStatusJson)[0];
data = data.replace(statusData, '');
data = data.replace(startStatusJson, '');
data = data.replace(endStatusJson, '');
} else if (data.includes(startStatusJson)) {
// check for start
const partialStatusData = data.split(startStatusJson)[1];
this._socketData += partialStatusData;
data = data.replace(partialStatusData, '');
data = data.replace(startStatusJson, '');
} else if (data.includes(endStatusJson)) {
// check for end and process the command
const partialStatusData = data.split(endStatusJson)[0];
this._socketData += partialStatusData;
data = data.replace(partialStatusData, '');
data = data.replace(endStatusJson, '');
this._socketData = '';
} else if (this._socketData.length > 0) {
// check if the line is all data then just concatenate
this._socketData += data;
data = '';
}
terminal.write(data);
})
);
if (this._bidirectional) {
this._disposables.push(terminal.onData(data => this._sendData(data)));
this._disposables.push(terminal.onBinary(data => this._sendBinary(data)));
}
this._disposables.push(addSocketListener(this._socket, 'close', () => this.dispose()));
this._disposables.push(addSocketListener(this._socket, 'error', () => this.dispose()));
}
public dispose(): void {
for (const d of this._disposables) {
d.dispose();
}
}
private _sendData(data: string): void {
if (!this._checkOpenSocket()) {
return;
}
this._socket.send(data);
}
private _sendBinary(data: string): void {
if (!this._checkOpenSocket()) {
return;
}
const buffer = new Uint8Array(data.length);
for (let i = 0; i < data.length; ++i) {
buffer[i] = data.charCodeAt(i) & 255;
}
this._socket.send(buffer);
}
private _checkOpenSocket(): boolean {
switch (this._socket.readyState) {
case WebSocket.OPEN:
return true;
case WebSocket.CONNECTING:
throw new Error('Attach addon was loaded before socket was open');
case WebSocket.CLOSING:
return false;
case WebSocket.CLOSED:
throw new Error('Attach addon socket is closed');
default:
throw new Error('Unexpected socket state');
}
}
}
function addSocketListener<K extends keyof WebSocketEventMap>(socket: WebSocket, type: K, handler: (this: WebSocket, ev: WebSocketEventMap[K]) => any): IDisposable {
socket.addEventListener(type, handler);
return {
dispose: () => {
if (!handler) {
// Already disposed
return;
}
socket.removeEventListener(type, handler);
}
};
}
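For orientation, a minimal usage sketch of the addon above; the WebSocket URI and host element id are placeholders, not values from this change:

import { Terminal } from "xterm";
import { AttachAddon } from "./AttachAddon"; // assumed module path

// Minimal sketch: wire an xterm.js terminal to a socket via the addon.
const term = new Terminal({ cursorBlink: true });
term.open(document.getElementById("terminal-root") as HTMLElement); // assumed host element

const socket = new WebSocket("wss://example.invalid/terminals/123"); // placeholder URI
term.loadAddon(new AttachAddon(socket, { bidirectional: true }));
// Socket data (with the ie_us/ie_ue status frames filtered out as above) now renders in the
// terminal, and user keystrokes are sent back over the socket.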

View File

@@ -0,0 +1,76 @@
import React, { useEffect, useRef } from "react";
import { Terminal } from "xterm";
import { FitAddon } from 'xterm-addon-fit';
import "xterm/css/xterm.css";
import { TerminalKind } from "../../../Contracts/ViewModels";
import { startCloudShellTerminal } from "./Core/CloudShellTerminalCore";
export interface CloudShellTerminalProps {
shellType: TerminalKind;
}
export const CloudShellTerminalComponent: React.FC<CloudShellTerminalProps> = ({
shellType
}: CloudShellTerminalProps) => {
const terminalRef = useRef(null); // Reference for terminal container
const xtermRef = useRef(null); // Reference for XTerm instance
const socketRef = useRef(null); // Reference for WebSocket
const fitAddon = new FitAddon();
useEffect(() => {
// Initialize XTerm instance
const term = new Terminal({
cursorBlink: true,
cursorStyle: 'bar',
fontFamily: 'monospace',
fontSize: 14,
theme: {
background: "#1e1e1e",
foreground: "#d4d4d4",
cursor: "#ffcc00"
},
scrollback: 1000
});
term.loadAddon(fitAddon);
// Attach terminal to the DOM
if (terminalRef.current) {
term.open(terminalRef.current);
xtermRef.current = term;
}
if (fitAddon) {
fitAddon.fit();
}
// Adjust terminal size on window resize
const handleResize = () => fitAddon.fit();
window.addEventListener('resize', handleResize);
// startCloudShellTerminal is async, so resolve the socket before storing it
startCloudShellTerminal(term, shellType)
.then((socket) => {
socketRef.current = socket;
})
.catch((error) => {
console.error("Failed to initialize CloudShell terminal:", error);
term.writeln(`\x1B[31mError initializing terminal: ${error.message}\x1B[0m`);
});
term.onData((data) => {
if (socketRef.current && socketRef.current.readyState === WebSocket.OPEN) {
socketRef.current.send(data);
}
});
// Cleanup function to close WebSocket and dispose terminal
return () => {
if (socketRef.current) {
socketRef.current.close(); // Close WebSocket connection
}
window.removeEventListener('resize', handleResize);
term.dispose(); // Clean up XTerm instance
};
}, [shellType]);
return <div ref={terminalRef} style={{ width: "100%", height: "500px"}} />;
};
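A minimal sketch of how this component could be rendered for a given shell type; the import path and mount element id are assumptions for illustration:

import React from "react";
import ReactDOM from "react-dom";
import { TerminalKind } from "../../../Contracts/ViewModels";
import { CloudShellTerminalComponent } from "./CloudShellTerminalComponent"; // assumed path

// Minimal sketch: mount a CloudShell terminal for a Mongo shell session.
ReactDOM.render(
  <CloudShellTerminalComponent shellType={TerminalKind.Mongo} />,
  document.getElementById("cloudshell-root"), // assumed host element
);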

View File

@@ -0,0 +1,152 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
*/
import { TerminalKind } from "../../../Contracts/ViewModels";
import { userContext } from "../../../UserContext";
export const getCommands = (terminalKind: TerminalKind, key: string) => {
let dbAccount = userContext.databaseAccount;
let endpoint;
switch (terminalKind) {
case TerminalKind.Postgres:
endpoint = dbAccount.properties.postgresqlEndpoint;
break;
case TerminalKind.Mongo:
endpoint = dbAccount.properties.mongoEndpoint;
break;
case TerminalKind.VCoreMongo:
endpoint = dbAccount.properties.vcoreMongoEndpoint;
break;
case TerminalKind.Cassandra:
endpoint = dbAccount.properties.cassandraEndpoint;
break;
default:
throw new Error("Unknown Terminal Kind");
}
let config = {
host: getHostFromUrl(endpoint),
name: dbAccount.name,
password: key,
endpoint: endpoint
};
return commands(terminalKind, config).join("\n").concat("\n");
};
export interface CommandConfig {
host: string,
name: string,
password: string,
endpoint: string
}
export const commands = (terminalKind: TerminalKind, config?: CommandConfig): string[] => {
switch (terminalKind) {
case TerminalKind.Postgres:
return [
// 1. Fetch and display location details in a readable format
"curl -s https://ipinfo.io | jq -r '\"Region: \" + .region + \" Country: \" + .country + \" City: \" + .city + \" IP Addr: \" + .ip'",
// 2. Check if psql is installed; if not, proceed with installation
"if ! command -v psql &> /dev/null; then echo '⚠️ psql not found. Installing...'; fi",
// 3. Download PostgreSQL if not installed
"if ! command -v psql &> /dev/null; then curl -LO https://ftp.postgresql.org/pub/source/v15.2/postgresql-15.2.tar.bz2; fi",
// 4. Extract PostgreSQL package if not installed
"if ! command -v psql &> /dev/null; then tar -xvjf postgresql-15.2.tar.bz2; fi",
// 5. Create a directory for PostgreSQL installation if not installed
"if ! command -v psql &> /dev/null; then mkdir -p ~/pgsql; fi",
// 6. Download readline (dependency for PostgreSQL) if not installed
"if ! command -v psql &> /dev/null; then curl -LO https://ftp.gnu.org/gnu/readline/readline-8.1.tar.gz; fi",
// 7. Extract readline package if not installed
"if ! command -v psql &> /dev/null; then tar -xvzf readline-8.1.tar.gz; fi",
// 8. Configure readline if not installed
"if ! command -v psql &> /dev/null; then cd readline-8.1 && ./configure --prefix=$HOME/pgsql; fi",
// 9. Add PostgreSQL to PATH if not installed
"if ! command -v psql &> /dev/null; then echo 'export PATH=$HOME/pgsql/bin:$PATH' >> ~/.bashrc; fi",
// 10. Source .bashrc to update PATH (even if psql was already installed)
"source ~/.bashrc",
// 11. Verify PostgreSQL installation
"psql --version",
// 12. Prompt for database name and username, then connect with psql over SSL
`read -p "Enter Database Name: " dbname && read -p "Enter Username: " username && psql "host=${config.endpoint} port=5432 dbname=$dbname user=$username sslmode=require"`
];
case TerminalKind.Mongo:
return [
// 1. Fetch and display location details in a readable format
"curl -s https://ipinfo.io | jq -r '\"Region: \" + .region + \" Country: \" + .country + \" City: \" + .city + \" IP Addr: \" + .ip'",
// 2. Check if mongosh is installed; if not, proceed with installation
"if ! command -v mongosh &> /dev/null; then echo '⚠️ mongosh not found. Installing...'; fi",
// 3. Download mongosh if not installed
"if ! command -v mongosh &> /dev/null; then curl -LO https://downloads.mongodb.com/compass/mongosh-2.3.8-linux-x64.tgz; fi",
// 4. Extract mongosh package if not installed
"if ! command -v mongosh &> /dev/null; then tar -xvzf mongosh-2.3.8-linux-x64.tgz; fi",
// 5. Move mongosh binaries if not installed
"if ! command -v mongosh &> /dev/null; then mkdir -p ~/mongosh && mv mongosh-2.3.8-linux-x64/* ~/mongosh/; fi",
// 6. Add mongosh to PATH if not installed
"if ! command -v mongosh &> /dev/null; then echo 'export PATH=$HOME/mongosh/bin:$PATH' >> ~/.bashrc; fi",
// 7. Source .bashrc to update PATH (even if mongosh was already installed)
"source ~/.bashrc",
// 8. Verify mongosh installation
"mongosh --version",
// 9. Login to MongoDB
`mongosh --host ${config.host} --port 10255 --username ${config.name} --password ${config.password} --tls --tlsAllowInvalidCertificates`
];
case TerminalKind.VCoreMongo:
return [
// 1. Fetch and display location details in a readable format
"curl -s https://ipinfo.io | jq -r '\"Region: \" + .region + \" Country: \" + .country + \" City: \" + .city + \" IP Addr: \" + .ip'",
// 2. Check if mongosh is installed; if not, proceed with installation
"if ! command -v mongosh &> /dev/null; then echo '⚠️ mongosh not found. Installing...'; fi",
// 3. Download mongosh if not installed
"if ! command -v mongosh &> /dev/null; then curl -LO https://downloads.mongodb.com/compass/mongosh-2.3.8-linux-x64.tgz; fi",
// 4. Extract mongosh package if not installed
"if ! command -v mongosh &> /dev/null; then tar -xvzf mongosh-2.3.8-linux-x64.tgz; fi",
// 5. Move mongosh binaries if not installed
"if ! command -v mongosh &> /dev/null; then mkdir -p ~/mongosh && mv mongosh-2.3.8-linux-x64/* ~/mongosh/; fi",
// 6. Add mongosh to PATH if not installed
"if ! command -v mongosh &> /dev/null; then echo 'export PATH=$HOME/mongosh/bin:$PATH' >> ~/.bashrc; fi",
// 7. Source .bashrc to update PATH (even if mongosh was already installed)
"source ~/.bashrc",
// 8. Verify mongosh installation
"mongosh --version",
// 9. Login to MongoDB vCore cluster
`read -p "Enter username: " username && mongosh "mongodb+srv://$username:@${config.endpoint}/?authMechanism=SCRAM-SHA-256&retrywrites=false&maxIdleTimeMS=120000" --tls --tlsAllowInvalidCertificates`
];
case TerminalKind.Cassandra:
return [
// 1. Fetch and display location details in a readable format
"curl -s https://ipinfo.io | jq -r '\"Region: \" + .region + \" Country: \" + .country + \" City: \" + .city + \" IP Addr: \" + .ip'",
// 2. Check if cqlsh is installed; if not, proceed with installation
"if ! command -v cqlsh &> /dev/null; then echo '⚠️ cqlsh not found. Installing...'; fi",
// 3. Download Cassandra if not installed
"if ! command -v cqlsh &> /dev/null; then curl -LO https://archive.apache.org/dist/cassandra/5.0.3/apache-cassandra-5.0.3-bin.tar.gz; fi",
// 4. Extract Cassandra package if not installed
"if ! command -v cqlsh &> /dev/null; then tar -xvzf apache-cassandra-5.0.3-bin.tar.gz; fi",
// 5. Move Cassandra binaries if not installed
"if ! command -v cqlsh &> /dev/null; then mkdir -p ~/cassandra && mv apache-cassandra-5.0.3/* ~/cassandra/; fi",
// 6. Add Cassandra to PATH if not installed
"if ! command -v cqlsh &> /dev/null; then echo 'export PATH=$HOME/cassandra/bin:$PATH' >> ~/.bashrc; fi",
// 7. Set environment variables for SSL
"if ! command -v cqlsh &> /dev/null; then echo 'export SSL_VERSION=TLSv1_2' >> ~/.bashrc; fi",
"if ! command -v cqlsh &> /dev/null; then echo 'export SSL_VALIDATE=false' >> ~/.bashrc; fi",
// 8. Source .bashrc to update PATH (even if cqlsh was already installed)
"source ~/.bashrc",
// 9. Verify cqlsh installation
"cqlsh --version",
// 10. Login to Cassandra
`cqlsh ${config.host} 10350 -u ${config.name} -p ${config.password} --ssl --protocol-version=4`
];
default:
return ["echo Unknown Shell"];
}
}
const getHostFromUrl = (mongoEndpoint: string): string => {
try {
const url = new URL(mongoEndpoint);
return url.hostname;
} catch (error) {
console.error("Invalid Mongo Endpoint URL:", error);
return "";
}
};
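A short sketch of how the helper above is typically consumed; the key value and socket endpoint are placeholders. getCommands joins the per-shell command list with newlines, so a single send runs the whole startup script in order:

import { TerminalKind } from "../../../Contracts/ViewModels";
import { getCommands } from "./Commands"; // assumed module path

// Minimal sketch: build and send the startup script for a Mongo shell session.
const primaryKey = "<account-key>"; // placeholder, resolved elsewhere
const startupScript = getCommands(TerminalKind.Mongo, primaryKey);

const socket = new WebSocket("wss://example.invalid/terminal"); // placeholder endpoint
socket.onopen = () => socket.send(startupScript); // one command per line, executed in order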

View File

@@ -0,0 +1,393 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Core functionality for CloudShell terminal management
*/
import { Terminal } from "xterm";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { userContext } from "../../../../UserContext";
import {
authorizeSession,
connectTerminal,
provisionConsole,
putEphemeralUserSettings,
registerCloudShellProvider,
verifyCloudShellProviderRegistration
} from "../Data/CloudShellApiClient";
import { getNormalizedRegion } from "../Data/RegionUtils";
import { ShellTypeHandler } from "../ShellTypes/ShellTypeFactory";
import { AttachAddon } from "../Utils/AttachAddOn";
import { wait } from "../Utils/CommonUtils";
import { terminalLog } from "../Utils/LogFormatter";
// Constants
const DEFAULT_CLOUDSHELL_REGION = "westus";
const POLLING_INTERVAL_MS = 5000;
const MAX_RETRY_COUNT = 10;
const MAX_PING_COUNT = 20 * 60; // 20 minutes (60 seconds/minute)
/**
* Main function to start a CloudShell terminal
*/
export const startCloudShellTerminal = async (terminal: Terminal, shellType: TerminalKind) => {
// Get the shell handler for this type
const shellHandler = ShellTypeHandler.getHandler(shellType);
terminal.writeln(terminalLog.header("Initializing Azure CloudShell"));
await ensureCloudShellProviderRegistered(terminal);
const { resolvedRegion, defaultCloudShellRegion } = determineCloudShellRegion(terminal);
// Ask for user consent for region
const consentGranted = await askForRegionConsent(terminal, resolvedRegion);
if (!consentGranted) {
return {}; // Exit if user declined
}
// Check network requirements for this shell type
const networkConfig = await shellHandler.configureNetworkAccess(terminal, resolvedRegion);
terminal.writeln("");
// Provision CloudShell session
terminal.writeln(terminalLog.cloudshell(`Provisioning Started....`));
let sessionDetails: {
socketUri?: string;
provisionConsoleResponse?: any;
targetUri?: string;
};
try {
sessionDetails = await provisionCloudShellSession(resolvedRegion, terminal, networkConfig.vNetSettings, networkConfig.isAllPublicAccessEnabled);
} catch (err) {
terminal.writeln(terminalLog.error(err));
terminal.writeln(terminalLog.error("Failed to provision in primary region"));
terminal.writeln(terminalLog.warning(`Attempting with fallback region: ${defaultCloudShellRegion}`));
sessionDetails = await provisionCloudShellSession(defaultCloudShellRegion, terminal, networkConfig.vNetSettings, networkConfig.isAllPublicAccessEnabled);
}
if (!sessionDetails.socketUri) {
terminal.writeln(terminalLog.error('Unable to provision console. Please try again later.'));
return {};
}
// Configure WebSocket connection with shell-specific commands
const socket = await establishTerminalConnection(
terminal,
shellHandler,
sessionDetails.socketUri,
sessionDetails.provisionConsoleResponse,
sessionDetails.targetUri
);
return socket;
};
/**
* Ensures that the CloudShell provider is registered for the current subscription
*/
export const ensureCloudShellProviderRegistered = async (terminal: Terminal): Promise<void> => {
try {
terminal.writeln(terminalLog.info("Verifying provider registration..."));
const response: any = await verifyCloudShellProviderRegistration(userContext.subscriptionId);
if (response.registrationState !== "Registered") {
terminal.writeln(terminalLog.warning("Registering CloudShell provider..."));
await registerCloudShellProvider(userContext.subscriptionId);
terminal.writeln(terminalLog.success("Provider registration successful"));
}
} catch (err) {
terminal.writeln(terminalLog.error("Unable to verify provider registration"));
throw err;
}
};
/**
* Determines the appropriate CloudShell region
*/
export const determineCloudShellRegion = (terminal: Terminal): { resolvedRegion: string; defaultCloudShellRegion: string } => {
const region = userContext.databaseAccount?.location;
const resolvedRegion = getNormalizedRegion(region, DEFAULT_CLOUDSHELL_REGION);
return { resolvedRegion, defaultCloudShellRegion: DEFAULT_CLOUDSHELL_REGION };
};
/**
* Asks the user for consent to use the specified CloudShell region
*/
export const askForRegionConsent = async (terminal: Terminal, resolvedRegion: string): Promise<boolean> => {
terminal.writeln(terminalLog.header("CloudShell Region Confirmation"));
terminal.writeln(terminalLog.info("The CloudShell container will be provisioned in a specific Azure region."));
// Data residency and compliance information
terminal.writeln(terminalLog.subheader("Important Information"));
const dbRegion = userContext.databaseAccount?.location || "unknown";
terminal.writeln(terminalLog.item("Database Region", dbRegion));
terminal.writeln(terminalLog.item("CloudShell Container Region", resolvedRegion));
terminal.writeln(terminalLog.subheader("What this means to you?"));
terminal.writeln(terminalLog.item("Data Residency", "Commands and query results will be processed in this region"));
terminal.writeln(terminalLog.item("Network", "Database connections will originate from this region"));
// Consent question
terminal.writeln("");
terminal.writeln(terminalLog.prompt("Would you like to provision Azure CloudShell in the '" + resolvedRegion + "' region?"));
terminal.writeln(terminalLog.prompt("Press 'Y' to continue or 'N' to cancel (Y/N)"));
return new Promise<boolean>((resolve) => {
const keyListener = terminal.onKey(({ key }: { key: string }) => {
keyListener.dispose();
terminal.writeln("");
if (key.toLowerCase() === 'y') {
terminal.writeln(terminalLog.success("Proceeding with CloudShell in " + resolvedRegion));
terminal.writeln(terminalLog.separator());
resolve(true);
} else {
terminal.writeln(terminalLog.error("CloudShell provisioning canceled"));
setTimeout(() => terminal.dispose(), 2000);
resolve(false);
}
});
});
};
/**
* Provisions a CloudShell session
*/
export const provisionCloudShellSession = async (
resolvedRegion: string,
terminal: Terminal,
vNetSettings: object,
isAllPublicAccessEnabled: boolean
): Promise<{ socketUri?: string; provisionConsoleResponse?: any; targetUri?: string }> => {
return new Promise( async (resolve, reject) => {
try {
terminal.writeln(terminalLog.header("Configuring CloudShell Session"));
// Check if vNetSettings is available and not empty
const hasVNetSettings = vNetSettings && Object.keys(vNetSettings).length > 0;
if (hasVNetSettings) {
terminal.writeln(terminalLog.vnet("Enabling private network configuration"));
displayNetworkSettings(terminal, vNetSettings, resolvedRegion);
}
else {
terminal.writeln(terminalLog.warning("No VNet configuration provided"));
terminal.writeln(terminalLog.warning("CloudShell will be provisioned with public network access"));
if (!isAllPublicAccessEnabled) {
terminal.writeln(terminalLog.error("Warning: Your database has network restrictions"));
terminal.writeln(terminalLog.error("CloudShell may not be able to connect without proper VNet configuration"));
}
}
terminal.writeln(terminalLog.warning("Any previous VNet settings will be overridden"));
// Apply user settings
await putEphemeralUserSettings(userContext.subscriptionId, resolvedRegion, vNetSettings);
terminal.writeln(terminalLog.success("Session settings applied"));
// Provision console
let provisionConsoleResponse;
let attemptCounter = 0;
do {
provisionConsoleResponse = await provisionConsole(userContext.subscriptionId, resolvedRegion);
terminal.writeln(terminalLog.progress("Provisioning", provisionConsoleResponse.properties.provisioningState));
attemptCounter++;
if (provisionConsoleResponse.properties.provisioningState !== "Succeeded") {
await wait(POLLING_INTERVAL_MS);
}
} while (provisionConsoleResponse.properties.provisioningState !== "Succeeded" && attemptCounter < 10);
if (provisionConsoleResponse.properties.provisioningState !== "Succeeded") {
const errorMessage = `Provisioning failed: ${provisionConsoleResponse.properties.provisioningState}`;
terminal.writeln(terminalLog.error(errorMessage));
return reject(new Error(errorMessage));
}
// Connect terminal
const connectTerminalResponse = await connectTerminal(
provisionConsoleResponse.properties.uri,
{ rows: terminal.rows, cols: terminal.cols }
);
const targetUri = `${provisionConsoleResponse.properties.uri}/terminals?cols=${terminal.cols}&rows=${terminal.rows}&version=2019-01-01&shell=bash`;
const termId = connectTerminalResponse.id;
// Determine socket URI
let socketUri = connectTerminalResponse.socketUri.replace(":443/", "");
const targetUriBody = targetUri.replace('https://', '').split('?')[0];
if (socketUri.indexOf(targetUriBody) === -1) {
socketUri = `wss://${targetUriBody}/${termId}`;
}
if (targetUriBody.includes('servicebus')) {
const targetUriBodyArr = targetUriBody.split('/');
socketUri = `wss://${targetUriBodyArr[0]}/$hc/${targetUriBodyArr[1]}/terminals/${termId}`;
}
return resolve({ socketUri, provisionConsoleResponse, targetUri });
} catch (err) {
terminal.writeln(terminalLog.error(`Provisioning failed: ${err.message}`));
return reject(err);
}
});
};
/**
* Display VNet settings in the terminal
*/
const displayNetworkSettings = (terminal: Terminal, vNetSettings: any, resolvedRegion: string): void => {
if (vNetSettings.networkProfileResourceId) {
const profileName = vNetSettings.networkProfileResourceId.split('/').pop();
terminal.writeln(terminalLog.item("Network Profile", profileName));
if (vNetSettings.relayNamespaceResourceId) {
const relayName = vNetSettings.relayNamespaceResourceId.split('/').pop();
terminal.writeln(terminalLog.item("Relay Namespace", relayName));
}
terminal.writeln(terminalLog.item("Region", resolvedRegion));
terminal.writeln(terminalLog.success("CloudShell will use this VNet to connect to your database"));
}
};
/**
* Establishes a terminal connection via WebSocket
*/
export const establishTerminalConnection = async (
terminal: Terminal,
shellHandler: any,
socketUri: string,
provisionConsoleResponse: any,
targetUri: string
): Promise<WebSocket> => {
let socket = new WebSocket(socketUri);
// Get shell-specific initial commands
const initCommands = await shellHandler.getInitialCommands();
// Configure the socket
socket = configureSocketConnection(socket, socketUri, terminal, initCommands, 0);
// Attach the terminal addon
const attachAddon = new AttachAddon(socket);
terminal.loadAddon(attachAddon);
terminal.writeln(terminalLog.success("Connection established"));
// Authorize the session
try {
const authorizeResponse = await authorizeSession(provisionConsoleResponse.properties.uri);
const cookieToken = authorizeResponse.token;
// Load auth token with a hidden image
const img = document.createElement("img");
img.src = `${targetUri}&token=${encodeURIComponent(cookieToken)}`;
terminal.focus();
} catch (err) {
terminal.writeln(terminalLog.error("Authorization failed"));
socket.close();
throw err;
}
return socket;
};
/**
* Configures a WebSocket connection for the terminal
*/
export const configureSocketConnection = (
socket: WebSocket,
uri: string,
terminal: Terminal,
initCommands: string,
socketRetryCount: number
): WebSocket => {
let jsonData = '';
let keepAliveID: NodeJS.Timeout = null;
let pingCount = 0;
sendTerminalStartupCommands(socket, initCommands);
socket.onclose = () => {
if (keepAliveID) {
clearTimeout(keepAliveID);
pingCount = 0;
}
terminal.writeln(terminalLog.warning("Session terminated. Refresh the page to start a new session."));
};
socket.onerror = () => {
if (socketRetryCount < MAX_RETRY_COUNT && socket.readyState !== WebSocket.CLOSED) {
configureSocketConnection(socket, uri, terminal, initCommands, socketRetryCount + 1);
} else {
socket.close();
}
};
socket.onmessage = (event: MessageEvent<string>) => {
pingCount = 0; // Reset ping count on message receipt
let eventData = '';
if (typeof event.data === "object") {
try {
const enc = new TextDecoder("utf-8");
eventData = enc.decode(event.data as any);
} catch (e) {
// Not an array buffer, ignore
}
}
if (typeof event.data === 'string') {
eventData = event.data;
}
// Process event data
if (eventData.includes("ie_us") && eventData.includes("ie_ue")) {
const statusData = eventData.split('ie_us')[1].split('ie_ue')[0];
console.log(statusData);
} else if (eventData.includes("ie_us")) {
jsonData += eventData.split('ie_us')[1];
} else if (eventData.includes("ie_ue")) {
jsonData += eventData.split('ie_ue')[0];
console.log(jsonData);
jsonData = '';
} else if (jsonData.length > 0) {
jsonData += eventData;
}
};
return socket;
};
/**
* Sends startup commands to the terminal
*/
export const sendTerminalStartupCommands = (socket: WebSocket, initCommands: string): void => {
let keepAliveID: NodeJS.Timeout = null;
let pingCount = 0;
if (socket && socket.readyState === WebSocket.OPEN) {
socket.send(initCommands);
} else {
socket.onopen = () => {
socket.send(initCommands);
const keepSocketAlive = (socket: WebSocket) => {
if (socket.readyState === WebSocket.OPEN) {
if (pingCount >= MAX_PING_COUNT) {
socket.close();
} else {
socket.send('');
pingCount++;
keepAliveID = setTimeout(() => keepSocketAlive(socket), 1000);
}
}
};
keepSocketAlive(socket);
};
}
};

View File

@@ -0,0 +1,320 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
*/
import { ApiVersionsConfig, ResourceType } from "Explorer/Tabs/CloudShellTab/DataModels";
import { v4 as uuidv4 } from 'uuid';
import { configContext } from "../../../ConfigContext";
import { TerminalKind } from "../../../Contracts/ViewModels";
import { userContext } from '../../../UserContext';
import { armRequest } from "../../../Utils/arm/request";
import { Authorization, ConnectTerminalResponse, NetworkType, OsType, ProvisionConsoleResponse, SessionType, Settings, ShellType } from "./DataModels";
/**
* API version configuration by terminal type and resource type
*/
const API_VERSIONS : ApiVersionsConfig = {
// Default version for fallback
DEFAULT: "2024-07-01",
// Resource type specific defaults
RESOURCE_DEFAULTS: {
[ResourceType.NETWORK]: "2023-05-01",
[ResourceType.DATABASE]: "2024-07-01",
[ResourceType.VNET]: "2023-05-01",
[ResourceType.SUBNET]: "2023-05-01",
[ResourceType.RELAY]: "2024-01-01",
[ResourceType.ROLE]: "2022-04-01"
},
// Shell-type specific versions with resource overrides
SHELL_TYPES: {
[TerminalKind.Mongo]: {
[ResourceType.DATABASE]: "2024-11-15"
},
[TerminalKind.VCoreMongo]: {
[ResourceType.DATABASE]: "2024-07-01"
},
[TerminalKind.Cassandra]: {
[ResourceType.DATABASE]: "2024-11-15"
}
}
};
export const validateUserSettings = (userSettings: Settings) => {
if (userSettings.sessionType !== SessionType.Ephemeral && userSettings.osType !== OsType.Linux) {
return false;
} else {
return true;
}
}
// Current shell type context
let currentShellType: TerminalKind | null = null;
/**
* Set the active shell type to determine API version
*/
export const setShellType = (shellType: TerminalKind): void => {
currentShellType = shellType;
};
/**
* Get the appropriate API version based on shell type and resource type
* Uses a cascading fallback approach for maximum flexibility
*/
export const getApiVersion = (resourceType?: ResourceType): string => {
// If no shell type is set, fallback to resource default or global default
if (!currentShellType) {
return resourceType ?
(API_VERSIONS.RESOURCE_DEFAULTS[resourceType] || API_VERSIONS.DEFAULT) :
API_VERSIONS.DEFAULT;
}
// Shell type is set, try to get specific version in this priority:
// 1. Shell-specific + resource-specific
if (resourceType &&
API_VERSIONS.SHELL_TYPES[currentShellType]) {
const shellTypeConfig = API_VERSIONS.SHELL_TYPES[currentShellType];
if (resourceType in shellTypeConfig) {
return shellTypeConfig[resourceType] as string;
}
}
// 2. Resource-specific default
if (resourceType && resourceType in API_VERSIONS.RESOURCE_DEFAULTS) {
return API_VERSIONS.RESOURCE_DEFAULTS[resourceType];
}
// 3. Global default
return API_VERSIONS.DEFAULT;
};
export const getUserRegion = async (subscriptionId: string, resourceGroup: string, accountName: string) => {
return await armRequest({
host: configContext.ARM_ENDPOINT,
path: `/subscriptions/${subscriptionId}/resourceGroups/${resourceGroup}/providers/Microsoft.DocumentDB/databaseAccounts/${accountName}`,
method: "GET",
apiVersion: "2022-12-01"
});
};
export const deleteUserSettings = async (): Promise<void> => {
await armRequest<void>({
host: configContext.ARM_ENDPOINT,
path: `/providers/Microsoft.Portal/userSettings/cloudconsole`,
method: "DELETE",
apiVersion: "2023-02-01-preview"
});
};
export const getUserSettings = async (): Promise<Settings> => {
const resp = await armRequest<any>({
host: configContext.ARM_ENDPOINT,
path: `/providers/Microsoft.Portal/userSettings/cloudconsole`,
method: "GET",
apiVersion: "2023-02-01-preview"
});
return resp;
};
export const putEphemeralUserSettings = async (userSubscriptionId: string, userRegion: string, vNetSettings?: object) => {
const ephemeralSettings = {
properties: {
preferredOsType: OsType.Linux,
preferredShellType: ShellType.Bash,
preferredLocation: userRegion,
networkType: !vNetSettings || Object.keys(vNetSettings).length === 0 ? NetworkType.Default : NetworkType.Isolated,
sessionType: SessionType.Ephemeral,
userSubscription: userSubscriptionId,
vnetSettings: vNetSettings ?? {}
}
};
return await armRequest({
host: configContext.ARM_ENDPOINT,
path: `/providers/Microsoft.Portal/userSettings/cloudconsole`,
method: "PUT",
apiVersion: "2023-02-01-preview",
body: ephemeralSettings
});
};
export const verifyCloudShellProviderRegistration = async(subscriptionId: string) => {
return await armRequest({
host: configContext.ARM_ENDPOINT,
path: `/subscriptions/${subscriptionId}/providers/Microsoft.CloudShell`,
method: "GET",
apiVersion: "2022-12-01"
});
};
export const registerCloudShellProvider = async (subscriptionId: string) => {
return await armRequest({
host: configContext.ARM_ENDPOINT,
path: `/subscriptions/${subscriptionId}/providers/Microsoft.CloudShell/register`,
method: "POST",
apiVersion: "2022-12-01"
});
};
export const provisionConsole = async (subscriptionId: string, location: string): Promise<ProvisionConsoleResponse> => {
const data = {
properties: {
osType: OsType.Linux
}
};
return await armRequest<any>({
host: configContext.ARM_ENDPOINT,
path: `providers/Microsoft.Portal/consoles/default`,
method: "PUT",
apiVersion: "2023-02-01-preview",
customHeaders: {
'x-ms-console-preferred-location': location
},
body: data,
});
};
export const connectTerminal = async (consoleUri: string, size: { rows: number, cols: number }): Promise<ConnectTerminalResponse> => {
const targetUri = consoleUri + `/terminals?cols=${size.cols}&rows=${size.rows}&version=2019-01-01&shell=bash`;
const resp = await fetch(targetUri, {
method: "POST",
headers: {
'Accept': 'application/json',
'Content-Type': 'application/json',
'Content-Length': '2',
'Authorization': userContext.authorizationToken,
'x-ms-client-request-id': uuidv4(),
'Accept-Language': getLocale(),
},
body: "{}" // empty body is necessary
});
return resp.json();
};
export const authorizeSession = async (consoleUri: string): Promise<Authorization> => {
const targetUri = consoleUri + "/authorize";
const resp = await fetch(targetUri, {
method: "POST",
headers: {
'Accept': 'application/json',
'Authorization': userContext.authorizationToken,
'Accept-Language': getLocale(),
"Content-Type": 'application/json'
},
body: "{}" // empty body is necessary
});
return resp.json();
};
export const getLocale = () => {
const langLocale = navigator.language;
return langLocale && langLocale.length >= 2 ? langLocale : 'en-us'; // fall back to "en-us" when the browser locale is unavailable
};
const validCloudShellRegions = new Set(["westus", "southcentralus", "eastus", "northeurope", "westeurope", "centralindia", "southeastasia", "westcentralus"]);
export const getNormalizedRegion = (region: string, defaultCloudshellRegion: string) => {
if (!region) return defaultCloudshellRegion;
const regionMap: Record<string, string> = {
"centralus": "westcentralus",
"eastus2": "eastus"
};
const normalizedRegion = regionMap[region.toLowerCase()] || region;
return validCloudShellRegions.has(normalizedRegion.toLowerCase()) ? normalizedRegion : defaultCloudshellRegion;
};
export async function getNetworkProfileInfo<T>(networkProfileResourceId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.NETWORK);
return await GetARMCall<T>(networkProfileResourceId, apiVersion);
}
export async function getAccountDetails<T>(databaseAccountId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.DATABASE);
return await GetARMCall<T>(databaseAccountId, apiVersion);
}
export async function getVnetInformation<T>(vnetId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.VNET);
return await GetARMCall<T>(vnetId, apiVersion);
}
export async function getSubnetInformation<T>(subnetId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.SUBNET);
return await GetARMCall<T>(subnetId, apiVersion);
}
export async function updateSubnetInformation<T>(subnetId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.SUBNET);
return await PutARMCall(subnetId, request, apiVersion);
}
export async function updateDatabaseAccount<T>(accountId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.DATABASE);
return await PutARMCall(accountId, request, apiVersion);
}
export async function getDatabaseOperations<T>(accountId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.DATABASE);
return await GetARMCall<T>(`${accountId}/operations`, apiVersion);
}
export async function updateVnet<T>(vnetId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.VNET);
return await PutARMCall<T>(vnetId, request, apiVersion);
}
export async function getVnet<T>(vnetId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.VNET);
return await GetARMCall<T>(vnetId, apiVersion);
}
export async function createNetworkProfile<T>(networkProfileId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.NETWORK);
return await PutARMCall<T>(networkProfileId, request, apiVersion);
}
export async function createRelay<T>(relayId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.RELAY);
return await PutARMCall<T>(relayId, request, apiVersion);
}
export async function getRelay<T>(relayId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.RELAY);
return await GetARMCall<T>(relayId, apiVersion);
}
export async function createRoleOnNetworkProfile<T>(roleId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.ROLE);
return await PutARMCall<T>(roleId, request, apiVersion);
}
export async function createRoleOnRelay<T>(roleId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.ROLE);
return await PutARMCall<T>(roleId, request, apiVersion);
}
export async function GetARMCall<T>(path: string, apiVersion: string = API_VERSIONS.DEFAULT): Promise<T> {
return await armRequest<T>({
host: configContext.ARM_ENDPOINT,
path: path,
method: "GET",
apiVersion: apiVersion
});
}
export async function PutARMCall<T>(path: string, request: object, apiVersion: string = API_VERSIONS.DEFAULT): Promise<T> {
return await armRequest<T>({
host: configContext.ARM_ENDPOINT,
path: path,
method: "PUT",
apiVersion: apiVersion,
body: request
});
}

View File

@@ -0,0 +1,263 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* CloudShell API client for various operations
*/
import { v4 as uuidv4 } from 'uuid';
import { configContext } from "../../../../ConfigContext";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { userContext } from '../../../../UserContext';
import { armRequest } from "../../../../Utils/arm/request";
import { ApiVersionsConfig, DEFAULT_API_VERSIONS } from "../Models/ApiVersions";
import { Authorization, ConnectTerminalResponse, NetworkType, OsType, ProvisionConsoleResponse, ResourceType, SessionType, Settings, ShellType } from "../Models/DataModels";
import { getLocale } from '../Data/LocalizationUtils';
// Current shell type context
let currentShellType: TerminalKind | null = null;
/**
* Set the active shell type to determine API version
*/
export const setShellType = (shellType: TerminalKind): void => {
currentShellType = shellType;
};
/**
* Get the appropriate API version based on shell type and resource type
*/
export const getApiVersion = (resourceType?: ResourceType, apiVersions?: ApiVersionsConfig): string => {
if (!apiVersions) {
apiVersions = DEFAULT_API_VERSIONS; // Default fallback
}
// Resolve the version in this priority order:
// 1. Shell-specific + resource-specific override (only when a shell type has been set)
if (resourceType && currentShellType && apiVersions.SHELL_TYPES[currentShellType]) {
const shellTypeConfig = apiVersions.SHELL_TYPES[currentShellType];
if (resourceType in shellTypeConfig && shellTypeConfig[resourceType]) {
return shellTypeConfig[resourceType] as string;
}
}
// 2. Resource-specific default
if (resourceType && resourceType in apiVersions.RESOURCE_DEFAULTS) {
return apiVersions.RESOURCE_DEFAULTS[resourceType];
}
// 3. Global default
return apiVersions.DEFAULT;
};
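// Illustrative sketch (not part of the original change) of how the lookup above resolves,
// assuming the DEFAULT_API_VERSIONS table defined in Models/ApiVersions:
//
//   setShellType(TerminalKind.Mongo);
//   getApiVersion(ResourceType.RELAY);    // "2024-01-01"  (shell-specific override)
//   setShellType(TerminalKind.Default);
//   getApiVersion(ResourceType.RELAY);    // "2022-10-01"  (resource default)
//   getApiVersion();                      // "2024-07-01"  (global default)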
export const getUserRegion = async (subscriptionId: string, resourceGroup: string, accountName: string) => {
return await armRequest({
host: configContext.ARM_ENDPOINT,
path: `/subscriptions/${subscriptionId}/resourceGroups/${resourceGroup}/providers/Microsoft.DocumentDB/databaseAccounts/${accountName}`,
method: "GET",
apiVersion: "2022-12-01"
});
};
export const deleteUserSettings = async (): Promise<void> => {
await armRequest<void>({
host: configContext.ARM_ENDPOINT,
path: `/providers/Microsoft.Portal/userSettings/cloudconsole`,
method: "DELETE",
apiVersion: "2023-02-01-preview"
});
};
export const getUserSettings = async (): Promise<Settings> => {
const resp = await armRequest<any>({
host: configContext.ARM_ENDPOINT,
path: `/providers/Microsoft.Portal/userSettings/cloudconsole`,
method: "GET",
apiVersion: "2023-02-01-preview"
});
return resp;
};
export const putEphemeralUserSettings = async (userSubscriptionId: string, userRegion: string, vNetSettings?: object) => {
const ephemeralSettings = {
properties: {
preferredOsType: OsType.Linux,
preferredShellType: ShellType.Bash,
preferredLocation: userRegion,
networkType: (!vNetSettings || Object.keys(vNetSettings).length === 0) ? NetworkType.Default : NetworkType.Isolated,
sessionType: SessionType.Ephemeral,
userSubscription: userSubscriptionId,
vnetSettings: vNetSettings ?? {}
}
};
return await armRequest({
host: configContext.ARM_ENDPOINT,
path: `/providers/Microsoft.Portal/userSettings/cloudconsole`,
method: "PUT",
apiVersion: "2023-02-01-preview",
body: ephemeralSettings
});
};
export const verifyCloudShellProviderRegistration = async (subscriptionId: string) => {
return await armRequest({
host: configContext.ARM_ENDPOINT,
path: `/subscriptions/${subscriptionId}/providers/Microsoft.CloudShell`,
method: "GET",
apiVersion: "2022-12-01"
});
};
export const registerCloudShellProvider = async (subscriptionId: string) => {
return await armRequest({
host: configContext.ARM_ENDPOINT,
path: `/subscriptions/${subscriptionId}/providers/Microsoft.CloudShell/register`,
method: "POST",
apiVersion: "2022-12-01"
});
};
export const provisionConsole = async (subscriptionId: string, location: string): Promise<ProvisionConsoleResponse> => {
const data = {
properties: {
osType: OsType.Linux
}
};
return await armRequest<any>({
host: configContext.ARM_ENDPOINT,
path: `/providers/Microsoft.Portal/consoles/default`,
method: "PUT",
apiVersion: "2023-02-01-preview",
customHeaders: {
'x-ms-console-preferred-location': location
},
body: data,
});
};
export const connectTerminal = async (consoleUri: string, size: { rows: number, cols: number }): Promise<ConnectTerminalResponse> => {
const targetUri = consoleUri + `/terminals?cols=${size.cols}&rows=${size.rows}&version=2019-01-01&shell=bash`;
const resp = await fetch(targetUri, {
method: "POST",
headers: {
'Accept': 'application/json',
'Content-Type': 'application/json',
'Content-Length': '2',
'Authorization': userContext.authorizationToken,
'x-ms-client-request-id': uuidv4(),
'Accept-Language': getLocale(),
},
body: "{}" // empty body is necessary
});
return resp.json();
};
export const authorizeSession = async (consoleUri: string): Promise<Authorization> => {
const targetUri = consoleUri + "/authorize";
const resp = await fetch(targetUri, {
method: "POST",
headers: {
'Accept': 'application/json',
'Authorization': userContext.authorizationToken,
'Accept-Language': getLocale(),
"Content-Type": 'application/json'
},
body: "{}" // empty body is necessary
});
return resp.json();
};
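// Rough call sequence, as these helpers appear intended to be composed (a sketch, not
// taken from this PR's calling code; polling and error handling omitted):
//
//   await putEphemeralUserSettings(subscriptionId, region, vNetSettings);
//   const console = await provisionConsole(subscriptionId, region);
//   const term = await connectTerminal(console.properties.uri, { rows: 24, cols: 80 });
//   const auth = await authorizeSession(console.properties.uri);
//   // term.socketUri and auth.token are then used to open the xterm web socket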
export async function getNetworkProfileInfo<T>(networkProfileResourceId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.NETWORK);
return await GetARMCall<T>(networkProfileResourceId, apiVersion);
}
export async function getAccountDetails<T>(databaseAccountId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.DATABASE);
return await GetARMCall<T>(databaseAccountId, apiVersion);
}
export async function getVnetInformation<T>(vnetId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.VNET);
return await GetARMCall<T>(vnetId, apiVersion);
}
export async function getSubnetInformation<T>(subnetId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.SUBNET);
return await GetARMCall<T>(subnetId, apiVersion);
}
export async function updateSubnetInformation<T>(subnetId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.SUBNET);
return await PutARMCall<T>(subnetId, request, apiVersion);
}
export async function updateDatabaseAccount<T>(accountId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.DATABASE);
return await PutARMCall<T>(accountId, request, apiVersion);
}
export async function getDatabaseOperations<T>(accountId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.DATABASE);
return await GetARMCall<T>(`${accountId}/operations`, apiVersion);
}
export async function updateVnet<T>(vnetId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.VNET);
return await PutARMCall<T>(vnetId, request, apiVersion);
}
export async function getVnet<T>(vnetId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.VNET);
return await GetARMCall<T>(vnetId, apiVersion);
}
export async function createNetworkProfile<T>(networkProfileId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.NETWORK);
return await PutARMCall<T>(networkProfileId, request, apiVersion);
}
export async function createRelay<T>(relayId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.RELAY);
return await PutARMCall<T>(relayId, request, apiVersion);
}
export async function getRelay<T>(relayId: string, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.RELAY);
return await GetARMCall<T>(relayId, apiVersion);
}
export async function createRoleOnNetworkProfile<T>(roleId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.ROLE);
return await PutARMCall<T>(roleId, request, apiVersion);
}
export async function createRoleOnRelay<T>(roleId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.ROLE);
return await PutARMCall<T>(roleId, request, apiVersion);
}
export async function createPrivateEndpoint<T>(privateEndpointId: string, request: object, apiVersionOverride?: string): Promise<T> {
const apiVersion = apiVersionOverride || getApiVersion(ResourceType.NETWORK);
return await PutARMCall<T>(privateEndpointId, request, apiVersion);
}
export async function GetARMCall<T>(path: string, apiVersion: string = DEFAULT_API_VERSIONS.DEFAULT): Promise<T> {
return await armRequest<T>({
host: configContext.ARM_ENDPOINT,
path: path,
method: "GET",
apiVersion: apiVersion
});
}
export async function PutARMCall<T>(path: string, request: object, apiVersion: string = DEFAULT_API_VERSIONS.DEFAULT): Promise<T> {
return await armRequest<T>({
host: configContext.ARM_ENDPOINT,
path: path,
method: "PUT",
apiVersion: apiVersion,
body: request
});
}

View File

@@ -0,0 +1,12 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Localization utilities for CloudShell
*/
/**
* Gets the current locale for API requests
*/
export const getLocale = (): string => {
const langLocale = navigator.language;
return (langLocale && langLocale.length > 2 ? langLocale : 'en-us');
};

View File

@@ -0,0 +1,37 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Region utilities for CloudShell
*/
export const getLocale = () => {
const langLocale = navigator.language;
return (langLocale && langLocale.length > 2 ? langLocale : 'en-us');
};
const validCloudShellRegions = new Set([
"westus",
"southcentralus",
"eastus",
"northeurope",
"westeurope",
"centralindia",
"southeastasia",
"westcentralus"
]);
/**
* Normalizes a region name to a valid CloudShell region
* @param region The region to normalize
* @param defaultCloudshellRegion Default region to use if the provided region is not supported
*/
export const getNormalizedRegion = (region: string, defaultCloudshellRegion: string) => {
if (!region) return defaultCloudshellRegion;
const regionMap: Record<string, string> = {
"centralus": "westcentralus",
"eastus2": "eastus"
};
const normalizedRegion = regionMap[region.toLowerCase()] || region;
return validCloudShellRegions.has(normalizedRegion.toLowerCase()) ? normalizedRegion : defaultCloudshellRegion;
};
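// Example behaviour (illustrative only):
//   getNormalizedRegion("eastus2", "westus")   // -> "eastus"        (mapped, supported)
//   getNormalizedRegion("centralus", "westus") // -> "westcentralus" (mapped, supported)
//   getNormalizedRegion("japaneast", "westus") // -> "westus"        (unsupported, falls back)
//   getNormalizedRegion("", "westus")          // -> "westus"        (missing region)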

View File

@@ -0,0 +1,185 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
*/
import { TerminalKind } from "../../../Contracts/ViewModels";
export const enum OsType {
Linux = "linux",
Windows = "windows"
}
export const enum ShellType {
Bash = "bash",
PowerShellCore = "pwsh"
}
export const enum NetworkType {
Default = "Default",
Isolated = "Isolated"
}
export const enum SessionType {
Mounted = "Mounted",
Ephemeral = "Ephemeral"
}
export const enum UserInputs {
NoReset = "1",
ConfigureVNet = "2",
ResetVNet = "3"
};
export type Settings = {
properties: UserSettingProperties
};
export type UserSettingProperties = {
networkType: string;
preferredLocation: string;
preferredOsType: OsType;
preferredShellType: ShellType;
userSubscription: string;
sessionType: SessionType;
vnetSettings: VnetSettings;
}
export type VnetSettings = {
networkProfileResourceId?: string;
relayNamespaceResourceId?: string;
location?: string;
}
export type ProvisionConsoleResponse = {
properties: {
osType: OsType;
provisioningState: string;
uri: string;
};
};
export type Authorization = {
token: string;
};
export type ConnectTerminalResponse = {
id: string;
idleTimeout: string;
rootDirectory: string;
socketUri: string;
tokenUpdated: boolean;
};
export type VnetModel = {
name: string;
id: string;
etag: string;
type: string;
location: string;
tags: Record<string, string>;
properties: {
provisioningState: string;
resourceGuid: string;
addressSpace: {
addressPrefixes: string[];
};
encryption: {
enabled: boolean;
enforcement: string;
};
privateEndpointVNetPolicies: string;
subnets: Array<{
name: string;
id: string;
etag: string;
type: string;
properties: {
provisioningState: string;
addressPrefixes?: string[];
addressPrefix?: string;
networkSecurityGroup?: { id: string };
ipConfigurations?: { id: string }[];
ipConfigurationProfiles?: { id: string }[];
privateEndpoints?: { id: string }[];
serviceEndpoints?: Array<{
provisioningState: string;
service: string;
locations: string[];
}>;
delegations?: Array<{
name: string;
id: string;
etag: string;
type: string;
properties: {
provisioningState: string;
serviceName: string;
actions: string[];
};
}>;
purpose?: string;
privateEndpointNetworkPolicies?: string;
privateLinkServiceNetworkPolicies?: string;
};
}>;
virtualNetworkPeerings: any[];
enableDdosProtection: boolean;
};
};
export type RelayNamespace = {
id: string;
name: string;
type: string;
location: string;
tags: Record<string, string>;
properties: {
metricId: string;
serviceBusEndpoint: string;
provisioningState: string;
status: string;
createdAt: string;
updatedAt: string;
};
sku: {
name: string;
tier: string;
};
};
export type RelayNamespaceResponse = {
value: RelayNamespace[];
};
/**
* Resource types for API versioning
*/
export enum ResourceType {
NETWORK = "NETWORK",
DATABASE = "DATABASE",
VNET = "VNET",
SUBNET = "SUBNET",
RELAY = "RELAY",
ROLE = "ROLE"
}
// Type definition for API_VERSIONS configuration
export type ApiVersionsConfig = {
// Global default API version
DEFAULT: string;
// Resource-specific default API versions
RESOURCE_DEFAULTS: {
[key in ResourceType]: string;
};
// Shell-type specific configurations
SHELL_TYPES: {
[key in TerminalKind]?: {
// Resource-specific overrides for this shell type
[key in ResourceType]?: string;
};
};
};

View File

@@ -0,0 +1,29 @@
/**
* Standardized terminal logging functions for consistent formatting
*/
export const terminalLog = {
// Section headers
header: (message: string) => `\n\x1B[1;34m┌─ ${message} ${"─".repeat(Math.max(45 - message.length, 0))}\x1B[0m`,
subheader: (message: string) => `\x1B[1;36m├ ${message}\x1B[0m`,
sectionEnd: () => `\x1B[1;34m└${"─".repeat(50)}\x1B[0m\n`,
// Status messages
success: (message: string) => `\x1B[32m✓ ${message}\x1B[0m`,
warning: (message: string) => `\x1B[33m⚠ ${message}\x1B[0m`,
error: (message: string) => `\x1B[31m✗ ${message}\x1B[0m`,
info: (message: string) => `\x1B[34m${message}\x1B[0m`,
// Resource information
database: (message: string) => `\x1B[35m🔶 Database: ${message}\x1B[0m`,
vnet: (message: string) => `\x1B[36m🔷 Network: ${message}\x1B[0m`,
cloudshell: (message: string) => `\x1B[32m🔷 CloudShell: ${message}\x1B[0m`,
// Data formatting
item: (label: string, value: string) => `${label}: \x1B[32m${value}\x1B[0m`,
progress: (operation: string, status: string) => `\x1B[34m${operation}: \x1B[36m${status}\x1B[0m`,
// User interaction
prompt: (message: string) => `\x1B[1;37m${message}\x1B[0m`,
separator: () => `\x1B[30;1m${"─".repeat(50)}\x1B[0m`
};
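// Typical usage from a handler (illustrative only; mirrors the calls made in VNetHandler):
//   terminal.writeln(terminalLog.header("Network Resource Configuration"));
//   terminal.writeln(terminalLog.item("VNet", vnetName));
//   terminal.writeln(terminalLog.success("Network configuration complete"));
//   terminal.writeln(terminalLog.sectionEnd());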

View File

@@ -0,0 +1,74 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* API versions configuration for CloudShell
*/
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { ResourceType } from "./DataModels";
/**
* Configuration for API versions used by the CloudShell
*/
export type ApiVersionsConfig = {
DEFAULT: string;
RESOURCE_DEFAULTS: Record<ResourceType, string>;
SHELL_TYPES: { [key in TerminalKind]?: { [key in ResourceType]?: string } };
};
/**
* Default API versions configuration
*/
export const DEFAULT_API_VERSIONS: ApiVersionsConfig = {
DEFAULT: '2024-07-01',
RESOURCE_DEFAULTS: {
[ResourceType.DATABASE]: '2024-11-15',
[ResourceType.NETWORK]: '2024-07-01',
[ResourceType.VNET]: '2024-07-01',
[ResourceType.SUBNET]: '2024-07-01',
[ResourceType.RELAY]: '2022-10-01',
[ResourceType.ROLE]: '2022-04-01',
},
SHELL_TYPES: {
[TerminalKind.Mongo]: {
[ResourceType.DATABASE]: '2024-11-15',
[ResourceType.NETWORK]: '2024-07-01',
[ResourceType.VNET]: '2024-07-01',
[ResourceType.SUBNET]: '2024-07-01',
[ResourceType.RELAY]: '2024-01-01',
[ResourceType.ROLE]: '2022-04-01',
},
[TerminalKind.VCoreMongo]: {
[ResourceType.DATABASE]: '2024-07-01',
[ResourceType.NETWORK]: '2024-07-01',
[ResourceType.VNET]: '2024-07-01',
[ResourceType.SUBNET]: '2024-07-01',
[ResourceType.RELAY]: '2024-01-01',
[ResourceType.ROLE]: '2022-04-01',
},
[TerminalKind.Postgres]: {
[ResourceType.DATABASE]: '2024-11-15',
[ResourceType.NETWORK]: '2024-07-01',
[ResourceType.VNET]: '2024-07-01',
[ResourceType.SUBNET]: '2024-07-01',
[ResourceType.RELAY]: '2024-01-01',
[ResourceType.ROLE]: '2022-04-01',
},
[TerminalKind.Cassandra]: {
[ResourceType.DATABASE]: '2024-11-15',
[ResourceType.NETWORK]: '2024-07-01',
[ResourceType.VNET]: '2024-07-01',
[ResourceType.SUBNET]: '2024-07-01',
[ResourceType.RELAY]: '2024-01-01',
[ResourceType.ROLE]: '2022-04-01',
},
// No shell-specific overrides for the default shell; fall back to RESOURCE_DEFAULTS.
[TerminalKind.Default]: {},
},
};

View File

@@ -0,0 +1,163 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Data models for CloudShell
*/
export const enum OsType {
Linux = "linux",
Windows = "windows"
}
export const enum ShellType {
Bash = "bash",
PowerShellCore = "pwsh"
}
export const enum NetworkType {
Default = "Default",
Isolated = "Isolated"
}
export const enum SessionType {
Mounted = "Mounted",
Ephemeral = "Ephemeral"
}
export const enum UserInputs {
NoReset = "1",
ConfigureVNet = "2",
ResetVNet = "3"
};
export type Settings = {
properties: UserSettingProperties
};
export type UserSettingProperties = {
networkType: string;
preferredLocation: string;
preferredOsType: OsType;
preferredShellType: ShellType;
userSubscription: string;
sessionType: SessionType;
vnetSettings: VnetSettings;
}
export type VnetSettings = {
networkProfileResourceId?: string;
relayNamespaceResourceId?: string;
location?: string;
}
export type ProvisionConsoleResponse = {
properties: {
osType: OsType;
provisioningState: string;
uri: string;
};
};
export type Authorization = {
token: string;
};
export type ConnectTerminalResponse = {
id: string;
idleTimeout: string;
rootDirectory: string;
socketUri: string;
tokenUpdated: boolean;
};
export type VnetModel = {
name: string;
id: string;
etag: string;
type: string;
location: string;
tags: Record<string, string>;
properties: {
provisioningState: string;
resourceGuid: string;
addressSpace: {
addressPrefixes: string[];
};
encryption: {
enabled: boolean;
enforcement: string;
};
privateEndpointVNetPolicies: string;
subnets: Array<{
name: string;
id: string;
etag: string;
type: string;
properties: {
provisioningState: string;
addressPrefixes?: string[];
addressPrefix?: string;
networkSecurityGroup?: { id: string };
ipConfigurations?: { id: string }[];
ipConfigurationProfiles?: { id: string }[];
privateEndpoints?: { id: string }[];
serviceEndpoints?: Array<{
provisioningState: string;
service: string;
locations: string[];
}>;
delegations?: Array<{
name: string;
id: string;
etag: string;
type: string;
properties: {
provisioningState: string;
serviceName: string;
actions: string[];
};
}>;
purpose?: string;
privateEndpointNetworkPolicies?: string;
privateLinkServiceNetworkPolicies?: string;
};
}>;
virtualNetworkPeerings: any[];
enableDdosProtection: boolean;
};
};
export type RelayNamespace = {
id: string;
name: string;
type: string;
location: string;
tags: Record<string, string>;
properties: {
metricId: string;
serviceBusEndpoint: string;
provisioningState: string;
status: string;
createdAt: string;
updatedAt: string;
};
sku: {
name: string;
tier: string;
};
};
export type RelayNamespaceResponse = {
value: RelayNamespace[];
};
/**
* Resource types for API versioning
*/
export enum ResourceType {
NETWORK = "NETWORK",
DATABASE = "DATABASE",
VNET = "VNET",
SUBNET = "SUBNET",
RELAY = "RELAY",
ROLE = "ROLE"
}

View File

@@ -0,0 +1,94 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Firewall handling functionality for CloudShell
*/
import { Terminal } from "xterm";
import { userContext } from "../../../../UserContext";
import { hasFirewallRestrictions } from "../../Shared/CheckFirewallRules";
import { getAccountDetails, updateDatabaseAccount } from "../Data/CloudShellApiClient";
import { askConfirmation } from "../Utils/CommonUtils";
import { terminalLog } from "../Utils/LogFormatter";
export class FirewallHandler {
/**
* Checks if firewall configuration is needed for CloudShell
*/
public static async checkFirewallConfiguration(terminal: Terminal): Promise<boolean> {
if (!hasFirewallRestrictions()) {
return false; // No firewall rules to configure
}
terminal.writeln(terminalLog.header("Database Firewall Configuration"));
terminal.writeln(terminalLog.warning("Your database has firewall restrictions enabled"));
terminal.writeln(terminalLog.warning("CloudShell might need access through these restrictions"));
const shouldConfigureFirewall = await askConfirmation(
terminal,
"Would you like to check and configure firewall settings?"
);
if (!shouldConfigureFirewall) {
terminal.writeln(terminalLog.info("Skipping firewall configuration"));
return false;
}
return await this.configureFirewallForCloudShell(terminal);
}
/**
* Configures firewall for CloudShell access
*/
private static async configureFirewallForCloudShell(terminal: Terminal): Promise<boolean> {
try {
// Get current database account details
terminal.writeln(terminalLog.database("Retrieving current firewall configuration..."));
const dbAccount = userContext.databaseAccount;
const currentDbAccount = await getAccountDetails<any>(dbAccount.id);
// Check whether public network access ("Allow Azure Services") is already enabled
const azureServicesEnabled = currentDbAccount.properties.publicNetworkAccess === "Enabled";
if (azureServicesEnabled) {
terminal.writeln(terminalLog.success("Azure services access is already enabled"));
return true;
}
// Ask user to enable Azure services access
terminal.writeln(terminalLog.warning("Azure services access is not enabled"));
terminal.writeln(terminalLog.info("CloudShell requires 'Allow Azure Services' to be enabled"));
const enableAzureServices = await askConfirmation(
terminal,
"Enable 'Allow Azure Services' for this database?"
);
if (!enableAzureServices) {
terminal.writeln(terminalLog.warning("CloudShell may not be able to connect without enabling Azure services access"));
return false;
}
// Update database account to enable Azure services access
terminal.writeln(terminalLog.info("Updating database firewall configuration..."));
// Create update payload - only modify firewall-related properties
const updatePayload = {
...currentDbAccount,
properties: {
...currentDbAccount.properties,
publicNetworkAccess: "Enabled"
}
};
await updateDatabaseAccount(dbAccount.id, updatePayload);
terminal.writeln(terminalLog.success("Database firewall updated successfully"));
terminal.writeln(terminalLog.success("Azure services access is now enabled"));
return true;
} catch (error) {
terminal.writeln(terminalLog.error(`Error configuring firewall: ${error.message}`));
return false;
}
}
}
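// Expected entry point from the shell bootstrap (a sketch; the calling code is not part
// of this excerpt):
//   const firewallConfigured = await FirewallHandler.checkFirewallConfiguration(terminal);
//   // returns false when no firewall restrictions exist or the user skips configuration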

View File

@@ -0,0 +1,99 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Network access configuration handler for CloudShell
*/
import { Terminal } from "xterm";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { IsPublicAccessAvailable } from "../../Shared/CheckFirewallRules";
import { getUserSettings } from "../Data/CloudShellApiClient";
import { VnetSettings } from "../Models/DataModels";
import { terminalLog } from "../Utils/LogFormatter";
import { VNetHandler } from "./VNetHandler";
export class NetworkAccessHandler {
/**
* Configures network access for the CloudShell based on shell type and network restrictions
*/
public static async configureNetworkAccess(
terminal: Terminal,
region: string,
shellType: TerminalKind
): Promise<{
vNetSettings: any;
isAllPublicAccessEnabled: boolean;
}> {
// Check if public access is available for this shell type
const isAllPublicAccessEnabled = await IsPublicAccessAvailable(shellType);
// If public access is enabled, no need for VNet configuration
if (isAllPublicAccessEnabled) {
terminal.writeln(terminalLog.database("Public access enabled. Skipping VNet configuration."));
return {
vNetSettings: {},
isAllPublicAccessEnabled: true
};
}
// Public access is restricted, we need to configure a VNet or use existing one
terminal.writeln(terminalLog.database("Network restrictions detected"));
terminal.writeln(terminalLog.info("Loading CloudShell configuration..."));
// Get existing settings if available
const settings = await getUserSettings();
if (!settings) {
terminal.writeln(terminalLog.warning("No existing user settings found."));
}
// Retrieve CloudShell VNet settings if available
let cloudShellVnetSettings: VnetSettings | undefined;
if (settings) {
cloudShellVnetSettings = await VNetHandler.retrieveCloudShellVnetSettings(settings, terminal);
}
// If CloudShell has VNet settings, check with database config
let finalVNetSettings = {};
if (cloudShellVnetSettings && cloudShellVnetSettings.networkProfileResourceId) {
// Check if we should use existing VNet settings
const isContinueWithSameVnet = await VNetHandler.askForVNetConfigConsent(terminal, shellType);
if (isContinueWithSameVnet) {
// Check if the VNet is already configured in the database
const isVNetInDatabaseConfig = await VNetHandler.isCloudShellVNetInDatabaseConfig(cloudShellVnetSettings, terminal);
if (!isVNetInDatabaseConfig) {
terminal.writeln(terminalLog.warning("CloudShell VNet is not configured in database access list"));
const addToDatabase = await VNetHandler.askToAddVNetToDatabase(terminal, cloudShellVnetSettings);
if (addToDatabase) {
await VNetHandler.addCloudShellVNetToDatabase(cloudShellVnetSettings, terminal);
finalVNetSettings = cloudShellVnetSettings;
} else {
// User declined to add VNet to database, need to recreate
terminal.writeln(terminalLog.warning("Will configure new VNet..."));
cloudShellVnetSettings = undefined;
}
} else {
terminal.writeln(terminalLog.success("CloudShell VNet is already in database configuration"));
finalVNetSettings = cloudShellVnetSettings;
}
} else {
cloudShellVnetSettings = undefined; // User declined to use existing VNet settings
}
}
// If we don't have valid VNet settings, create new ones
if (!cloudShellVnetSettings || !cloudShellVnetSettings.networkProfileResourceId) {
terminal.writeln(terminalLog.subheader("Configuring network infrastructure"));
finalVNetSettings = await VNetHandler.configureCloudShellVNet(terminal, region);
// Add the new VNet to the database
await VNetHandler.addCloudShellVNetToDatabase(finalVNetSettings as VnetSettings, terminal);
}
return {
vNetSettings: finalVNetSettings,
isAllPublicAccessEnabled: false
};
}
}
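// Presumed wiring with the user-settings API (a sketch, not taken from this PR):
//   const { vNetSettings, isAllPublicAccessEnabled } =
//     await NetworkAccessHandler.configureNetworkAccess(terminal, region, TerminalKind.Mongo);
//   await putEphemeralUserSettings(userContext.subscriptionId, region, vNetSettings);
//   // NetworkType.Isolated is selected automatically when vNetSettings is non-empty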

View File

@@ -0,0 +1,894 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* VNet handling functionality for CloudShell
*/
import { v4 as uuidv4 } from 'uuid';
import { Terminal } from "xterm";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { userContext } from "../../../../UserContext";
import { hasPrivateEndpointsRestrictions } from "../../Shared/CheckFirewallRules";
import {
createNetworkProfile,
createPrivateEndpoint,
createRelay,
createRoleOnNetworkProfile,
createRoleOnRelay,
getAccountDetails,
getDatabaseOperations,
getNetworkProfileInfo,
getRelay,
getSubnetInformation,
getVnet,
getVnetInformation,
updateDatabaseAccount,
updateSubnetInformation,
updateVnet
} from "../Data/CloudShellApiClient";
import { Settings, VnetSettings } from "../Models/DataModels";
import { askConfirmation, askQuestion, wait } from "../Utils/CommonUtils";
import { terminalLog } from "../Utils/LogFormatter";
// Constants for VNet configuration
const POLLING_INTERVAL_MS = 5000;
const MAX_RETRY_COUNT = 10;
const STANDARD_SKU = "Standard";
const DEFAULT_VNET_ADDRESS_PREFIX = "10.0.0.0/16";
const DEFAULT_SUBNET_ADDRESS_PREFIX = "10.0.1.0/24";
const DEFAULT_CONTAINER_INSTANCE_OID = "88536fb9-d60a-4aee-8195-041425d6e927";
export class VNetHandler {
/**
* Retrieves CloudShell VNet settings from user settings
*/
public static async retrieveCloudShellVnetSettings(settings: Settings, terminal: Terminal): Promise<VnetSettings> {
if (settings?.properties?.vnetSettings && Object.keys(settings.properties.vnetSettings).length > 0) {
try {
const netProfileInfo = await getNetworkProfileInfo<any>(settings.properties.vnetSettings.networkProfileResourceId);
terminal.writeln(terminalLog.header("Existing Network Configuration"));
const subnetId = netProfileInfo.properties.containerNetworkInterfaceConfigurations[0]
.properties.ipConfigurations[0].properties.subnet.id;
const vnetResourceId = subnetId.replace(/\/subnets\/[^/]+$/, '');
terminal.writeln(terminalLog.item("VNet", vnetResourceId));
terminal.writeln(terminalLog.item("Subnet", subnetId));
terminal.writeln(terminalLog.item("Location", settings.properties.vnetSettings.location));
terminal.writeln(terminalLog.item("Network Profile", settings.properties.vnetSettings.networkProfileResourceId));
terminal.writeln(terminalLog.item("Relay Namespace", settings.properties.vnetSettings.relayNamespaceResourceId));
return {
networkProfileResourceId: settings.properties.vnetSettings.networkProfileResourceId,
relayNamespaceResourceId: settings.properties.vnetSettings.relayNamespaceResourceId,
location: settings.properties.vnetSettings.location
};
} catch (err) {
terminal.writeln(terminalLog.warning("Error retrieving network profile. Will configure new network."));
return undefined;
}
}
return undefined;
}
/**
* Asks the user if they want to use existing network configuration (VNet or private endpoint)
*/
public static async askForVNetConfigConsent(terminal: Terminal, shellType: TerminalKind | null = null): Promise<boolean> {
// Check if this shell type supports only private endpoints
const isPrivateEndpointOnlyShell = shellType === TerminalKind.VCoreMongo;
// Check if the database has private endpoints configured
const hasPrivateEndpoints = hasPrivateEndpointsRestrictions();
// Determine which network type to mention based on shell type and database configuration
const networkType = isPrivateEndpointOnlyShell || hasPrivateEndpoints ? "private endpoint" : "network";
// Ask for consent
terminal.writeln("");
terminal.writeln(terminalLog.prompt(`Use this existing ${networkType} configuration?`));
terminal.writeln(terminalLog.info(`Answering 'N' will configure a new ${networkType} for CloudShell`));
return await askConfirmation(terminal, `Press Y/N to continue...`);
}
/**
* Checks if the CloudShell VNet is already in the database configuration
*/
public static async isCloudShellVNetInDatabaseConfig(vNetSettings: VnetSettings, terminal: Terminal): Promise<boolean> {
try {
terminal.writeln(terminalLog.subheader("Verifying if CloudShell VNet is configured in database"));
// Get the subnet ID from the CloudShell Network Profile
const netProfileInfo = await getNetworkProfileInfo<any>(vNetSettings.networkProfileResourceId);
if (!netProfileInfo?.properties?.containerNetworkInterfaceConfigurations?.[0]
?.properties?.ipConfigurations?.[0]?.properties?.subnet?.id) {
terminal.writeln(terminalLog.warning("Could not retrieve subnet ID from CloudShell VNet"));
return false;
}
const cloudShellSubnetId = netProfileInfo.properties.containerNetworkInterfaceConfigurations[0]
.properties.ipConfigurations[0].properties.subnet.id;
terminal.writeln(terminalLog.item("CloudShell Subnet", cloudShellSubnetId.split('/').pop() || ""));
// Check if this subnet ID is in the database VNet rules
const dbAccount = userContext.databaseAccount;
if (!dbAccount?.properties?.virtualNetworkRules) {
return false;
}
const vnetRules = dbAccount.properties.virtualNetworkRules;
// Check if the CloudShell subnet is already in the rules
return vnetRules.some(rule => rule.id === cloudShellSubnetId);
} catch (err) {
terminal.writeln(terminalLog.error("Error checking database VNet configuration"));
return false;
}
}
/**
* Asks the user if they want to add the CloudShell VNet to the database configuration
*/
public static async askToAddVNetToDatabase(terminal: Terminal, vNetSettings: VnetSettings): Promise<boolean> {
terminal.writeln("");
terminal.writeln(terminalLog.header("Network Configuration Mismatch"));
terminal.writeln(terminalLog.warning("Your CloudShell VNet is not in your database's allowed networks"));
terminal.writeln(terminalLog.warning("To connect from CloudShell, this VNet must be added to your database"));
return await askConfirmation(terminal, "Add CloudShell VNet to database configuration?");
}
/**
* Adds the CloudShell VNet to the database configuration
* Now supports both VNet rules and private endpoints
*/
public static async addCloudShellVNetToDatabase(vNetSettings: VnetSettings, terminal: Terminal): Promise<void> {
try {
terminal.writeln(terminalLog.header("Updating database network configuration"));
// Step 1: Get the subnet ID from CloudShell Network Profile
const { cloudShellSubnetId, cloudShellVnetId } = await this.getCloudShellNetworkIds(vNetSettings, terminal);
// Step 2: Get current database account details
const { currentDbAccount } = await this.getDatabaseAccountDetails(terminal);
// Step 3: Determine if database uses private endpoints
const usesPrivateEndpoints = hasPrivateEndpointsRestrictions() ||
(currentDbAccount.properties.privateEndpointConnections?.length > 0);
// Log which networking mode we're using
if (usesPrivateEndpoints) {
terminal.writeln(terminalLog.info("Database is configured with private endpoints"));
} else {
terminal.writeln(terminalLog.info("Database is configured with VNet rules"));
}
// Step 4: Check if connection is already configured
if (usesPrivateEndpoints) {
if (await this.isPrivateEndpointAlreadyConfigured(cloudShellVnetId, currentDbAccount, terminal)) {
return;
}
} else {
if (await this.isVNetAlreadyConfigured(cloudShellSubnetId, currentDbAccount, terminal)) {
return;
}
}
// Step 5: Check network resource statuses and ongoing operations
const { vnetInfo, subnetInfo, operationInProgress } =
await this.checkNetworkResourceStatuses(cloudShellSubnetId, cloudShellVnetId, currentDbAccount.id, terminal);
// Step 6: If no operation in progress, update the configuration
if (!operationInProgress) {
if (usesPrivateEndpoints) {
// Create or update private endpoint configuration
await this.configurePrivateEndpoint(
cloudShellSubnetId,
vnetInfo.location,
currentDbAccount.id,
terminal
);
} else {
// Enable CosmosDB service endpoint on subnet if needed (for VNet rules)
await this.enableCosmosDBServiceEndpoint(cloudShellSubnetId, subnetInfo, terminal);
// Update database account with VNet rule
await this.updateDatabaseWithVNetRule(currentDbAccount, cloudShellSubnetId, currentDbAccount.id, terminal);
}
} else {
terminal.writeln(terminalLog.info("Monitoring existing network operation..."));
// Step 7: Monitor the update progress
await this.monitorVNetAdditionProgress(cloudShellSubnetId, currentDbAccount.id, terminal);
}
} catch (err) {
terminal.writeln(terminalLog.error(`Error updating database network configuration: ${err.message}`));
throw err;
}
}
/**
* Checks if a private endpoint is already configured for the CloudShell VNet
*/
private static async isPrivateEndpointAlreadyConfigured(
cloudShellVnetId: string,
currentDbAccount: any,
terminal: Terminal
): Promise<boolean> {
// Check if private endpoints exist and are properly configured for this VNet
const hasConfiguredEndpoint = currentDbAccount.properties.privateEndpointConnections?.some(
(connection: any) => {
const isApproved = connection.properties.privateLinkServiceConnectionState.status === 'Approved';
// We would need to check if the endpoint is in the CloudShell VNet
// For simplicity, we're assuming connection.properties.networkInterface contains this info
const endpointVNetId = connection.properties.networkInterface?.id?.split('/subnets/')[0];
return isApproved && endpointVNetId === cloudShellVnetId;
}
);
if (hasConfiguredEndpoint) {
terminal.writeln(terminalLog.success("CloudShell private endpoint is already configured"));
return true;
}
return false;
}
/**
* Configures a private endpoint for the CloudShell VNet to connect to the database
*/
private static async configurePrivateEndpoint(
cloudShellSubnetId: string,
vnetLocation: any,
dbAccountId: string,
terminal: Terminal
): Promise<void> {
// Extract necessary information from IDs
const subnetIdParts = cloudShellSubnetId.split('/');
const subnetIndex = subnetIdParts.indexOf('subnets');
const subnetName = subnetIdParts[subnetIndex + 1];
const resourceGroup = subnetIdParts[4];
const subscriptionId = subnetIdParts[2];
// Generate a unique name for the private endpoint
const privateEndpointName = `pe-cloudshell-cosmos-${Math.floor(10000 + Math.random() * 90000)}`;
terminal.writeln(terminalLog.subheader("Creating private endpoint for CloudShell"));
terminal.writeln(terminalLog.item("Private Endpoint Name", privateEndpointName));
terminal.writeln(terminalLog.item("Target Subnet", subnetName));
// Construct the private endpoint creation payload
const privateEndpointPayload = {
location: vnetLocation,
properties: {
privateLinkServiceConnections: [
{
name: privateEndpointName,
properties: {
privateLinkServiceId: dbAccountId,
groupIds: [
"MongoDB"
],
requestMessage: "CloudShell connectivity request"
},
type: "Microsoft.Network/privateEndpoints/privateLinkServiceConnections"
}
],
subnet: {
id: cloudShellSubnetId
}
}
};
// Send the request to create the private endpoint
// Note: This is a placeholder - we would need to implement this API call
terminal.writeln(terminalLog.info("Submitting private endpoint creation request"));
try {
const privateEndpointUrl = `/subscriptions/${subscriptionId}/resourceGroups/${resourceGroup}/providers/Microsoft.Network/privateEndpoints/${privateEndpointName}`;
await createPrivateEndpoint(privateEndpointUrl, privateEndpointPayload, "2024-05-01");
terminal.writeln(terminalLog.success("Private endpoint creation request submitted"));
terminal.writeln(terminalLog.warning("Please approve the private endpoint connection in the Azure portal"));
terminal.writeln(terminalLog.info("Note: Private endpoint operations may take several minutes to complete"));
} catch (err) {
terminal.writeln(terminalLog.error(`Failed to create private endpoint: ${err.message}`));
throw err;
}
}
/**
* Gets the subnet and VNet IDs from CloudShell Network Profile
*/
private static async getCloudShellNetworkIds(vNetSettings: VnetSettings, terminal: Terminal): Promise<{ cloudShellSubnetId: string; cloudShellVnetId: string }> {
const netProfileInfo = await getNetworkProfileInfo<any>(vNetSettings.networkProfileResourceId);
if (!netProfileInfo?.properties?.containerNetworkInterfaceConfigurations?.[0]
?.properties?.ipConfigurations?.[0]?.properties?.subnet?.id) {
throw new Error("Could not retrieve subnet ID from CloudShell VNet");
}
const cloudShellSubnetId = netProfileInfo.properties.containerNetworkInterfaceConfigurations[0]
.properties.ipConfigurations[0].properties.subnet.id;
// Extract VNet ID from subnet ID
const cloudShellVnetId = cloudShellSubnetId.substring(0, cloudShellSubnetId.indexOf('/subnets/'));
terminal.writeln(terminalLog.subheader("Identified CloudShell network resources"));
terminal.writeln(terminalLog.item("Subnet", cloudShellSubnetId.split('/').pop() || ""));
terminal.writeln(terminalLog.item("VNet", cloudShellVnetId.split('/').pop() || ""));
return { cloudShellSubnetId, cloudShellVnetId };
}
/**
* Gets the database account details
*/
private static async getDatabaseAccountDetails(terminal: Terminal): Promise<{ currentDbAccount: any }> {
const dbAccount = userContext.databaseAccount;
terminal.writeln(terminalLog.database("Verifying current configuration"));
const currentDbAccount = await getAccountDetails<any>(dbAccount.id);
return { currentDbAccount };
}
/**
* Checks if the VNet is already configured in the database
*/
private static async isVNetAlreadyConfigured(cloudShellSubnetId: string, currentDbAccount: any, terminal: Terminal): Promise<boolean> {
const vnetAlreadyConfigured = currentDbAccount.properties.virtualNetworkRules &&
currentDbAccount.properties.virtualNetworkRules.some(
(rule: any) => rule.id === cloudShellSubnetId
);
if (vnetAlreadyConfigured) {
terminal.writeln(terminalLog.success("CloudShell VNet is already in database configuration"));
return true;
}
return false;
}
/**
* Checks the status of network resources and ongoing operations
*/
private static async checkNetworkResourceStatuses(
cloudShellSubnetId: string,
cloudShellVnetId: string,
dbAccountId: string,
terminal: Terminal
): Promise<{ vnetInfo: any; subnetInfo: any; operationInProgress: boolean }> {
terminal.writeln(terminalLog.subheader("Checking network resource status"));
let operationInProgress = false;
let vnetInfo: any = null;
let subnetInfo: any = null;
if (cloudShellVnetId && cloudShellSubnetId) {
// Get VNet and subnet resource status
vnetInfo = await getVnetInformation<any>(cloudShellVnetId);
subnetInfo = await getSubnetInformation<any>(cloudShellSubnetId);
// Check if there's an ongoing operation on the VNet or subnet
const vnetProvisioningState = vnetInfo?.properties?.provisioningState;
const subnetProvisioningState = subnetInfo?.properties?.provisioningState;
if (vnetProvisioningState !== 'Succeeded' && vnetProvisioningState !== 'Failed') {
terminal.writeln(terminalLog.warning(`VNet operation in progress: ${vnetProvisioningState}`));
operationInProgress = true;
}
if (subnetProvisioningState !== 'Succeeded' && subnetProvisioningState !== 'Failed') {
terminal.writeln(terminalLog.warning(`Subnet operation in progress: ${subnetProvisioningState}`));
operationInProgress = true;
}
// Also check database operations
const latestDbAccount = await getAccountDetails<any>(dbAccountId);
if (latestDbAccount.properties.virtualNetworkRules) {
const isPendingAdd = latestDbAccount.properties.virtualNetworkRules.some(
(rule: any) => rule.id === cloudShellSubnetId && rule.status === 'Updating'
);
if (isPendingAdd) {
terminal.writeln(terminalLog.warning("CloudShell VNet addition to database is already in progress"));
operationInProgress = true;
}
}
}
return { vnetInfo, subnetInfo, operationInProgress };
}
/**
* Enables the CosmosDB service endpoint on a subnet if needed
*/
private static async enableCosmosDBServiceEndpoint(cloudShellSubnetId: string, subnetInfo: any, terminal: Terminal): Promise<void> {
if (!subnetInfo) {
terminal.writeln(terminalLog.warning("Unable to check subnet endpoint configuration"));
return;
}
terminal.writeln(terminalLog.subheader("Checking and configuring CosmosDB service endpoint"));
// Parse the subnet ID to get resource information
const subnetIdParts = cloudShellSubnetId.split('/');
const subnetIndex = subnetIdParts.indexOf('subnets');
if (subnetIndex > 0) {
const subnetName = subnetIdParts[subnetIndex + 1];
const vnetName = subnetIdParts[subnetIndex - 1];
const resourceGroup = subnetIdParts[4];
const subscriptionId = subnetIdParts[2];
// Get the subnet URL
const subnetUrl = `/subscriptions/${subscriptionId}/resourceGroups/${resourceGroup}/providers/Microsoft.Network/virtualNetworks/${vnetName}/subnets/${subnetName}`;
// Check if CosmosDB service endpoint is already enabled
const hasCosmosDBEndpoint = subnetInfo.properties.serviceEndpoints &&
subnetInfo.properties.serviceEndpoints.some(
(endpoint: any) => endpoint.service === 'Microsoft.AzureCosmosDB'
);
if (!hasCosmosDBEndpoint) {
terminal.writeln(terminalLog.warning("Enabling CosmosDB service endpoint on subnet..."));
// Create update payload with CosmosDB service endpoint
const serviceEndpoints = [
...(subnetInfo.properties.serviceEndpoints || []),
{ service: 'Microsoft.AzureCosmosDB' }
];
// Update the subnet configuration while preserving existing properties
const subnetUpdatePayload = {
...subnetInfo,
properties: {
...subnetInfo.properties,
serviceEndpoints: serviceEndpoints
}
};
// Apply the subnet update
await updateSubnetInformation(subnetUrl, subnetUpdatePayload);
// Wait for the subnet update to complete
let subnetUpdateComplete = false;
let subnetRetryCount = 0;
while (!subnetUpdateComplete && subnetRetryCount < MAX_RETRY_COUNT) {
const updatedSubnet = await getSubnetInformation<any>(subnetUrl);
const endpointEnabled = updatedSubnet.properties.serviceEndpoints &&
updatedSubnet.properties.serviceEndpoints.some(
(endpoint: any) => endpoint.service === 'Microsoft.AzureCosmosDB'
);
if (endpointEnabled && updatedSubnet.properties.provisioningState === 'Succeeded') {
subnetUpdateComplete = true;
terminal.writeln(terminalLog.success("CosmosDB service endpoint enabled successfully"));
} else {
subnetRetryCount++;
terminal.writeln(terminalLog.progress("Subnet update", `Waiting (${subnetRetryCount}/${MAX_RETRY_COUNT})`));
await wait(POLLING_INTERVAL_MS);
}
}
if (!subnetUpdateComplete) {
throw new Error("Failed to enable CosmosDB service endpoint on subnet");
}
} else {
terminal.writeln(terminalLog.success("CosmosDB service endpoint is already enabled"));
}
}
}
/**
* Updates the database account with a new VNet rule
*/
private static async updateDatabaseWithVNetRule(currentDbAccount: any, cloudShellSubnetId: string, dbAccountId: string, terminal: Terminal): Promise<void> {
// Create a deep copy of the current database account
const updatePayload = JSON.parse(JSON.stringify(currentDbAccount));
// Update only the network-related properties
updatePayload.properties.virtualNetworkRules = [
...(currentDbAccount.properties.virtualNetworkRules || []),
{ id: cloudShellSubnetId, ignoreMissingVNetServiceEndpoint: false }
];
updatePayload.properties.isVirtualNetworkFilterEnabled = true;
// Update the database account
terminal.writeln(terminalLog.subheader("Submitting VNet update request to database"));
await updateDatabaseAccount(dbAccountId, updatePayload);
terminal.writeln(terminalLog.success("Updated Database account with Cloud Shell Vnet"));
}
/**
* Monitors the progress of adding a VNet to the database account
*/
private static async monitorVNetAdditionProgress(cloudShellSubnetId: string, dbAccountId: string, terminal: Terminal): Promise<void> {
let updateComplete = false;
let retryCount = 0;
let lastStatus = "";
let lastProgress = 0;
let lastOpId = "";
terminal.writeln(terminalLog.subheader("Monitoring database update progress"));
while (!updateComplete && retryCount < MAX_RETRY_COUNT) {
// Check if the VNet is now in the database account
const updatedDbAccount = await getAccountDetails<any>(dbAccountId);
const isVNetAdded = updatedDbAccount.properties.virtualNetworkRules?.some(
(rule: any) => rule.id === cloudShellSubnetId && (!rule.status || rule.status === 'Succeeded')
);
if (isVNetAdded) {
updateComplete = true;
terminal.writeln(terminalLog.success("CloudShell VNet successfully added to database configuration"));
break;
}
// If not yet added, check for operation progress
const operations = await getDatabaseOperations<any>(dbAccountId);
// Find network-related operations
const networkOps = operations.value?.filter(
(op: any) =>
(op.properties.description?.toLowerCase().includes('network') ||
op.properties.description?.toLowerCase().includes('vnet'))
) || [];
// Find active operations
const activeOp = networkOps.find((op: any) => op.properties.status === 'InProgress');
if (activeOp) {
// Show progress details if available
const currentStatus = activeOp.properties.status;
const progress = activeOp.properties.percentComplete || 0;
const opId = activeOp.name;
// Only update the terminal if something has changed
if (currentStatus !== lastStatus || progress !== lastProgress || opId !== lastOpId) {
// Create a progress bar
const progressBarLength = 20;
const filledLength = Math.floor(progress / 100 * progressBarLength);
const progressBar = "█".repeat(filledLength) + "░".repeat(progressBarLength - filledLength);
terminal.writeln(`\x1B[34m [${progressBar}] ${progress}% - ${currentStatus}\x1B[0m`);
lastStatus = currentStatus;
lastProgress = progress;
lastOpId = opId;
}
} else if (networkOps.length > 0) {
// If there are completed operations, show their status
const lastCompletedOp = networkOps[0];
if (lastCompletedOp.properties.status !== lastStatus) {
terminal.writeln(terminalLog.progress("Operation status", lastCompletedOp.properties.status));
lastStatus = lastCompletedOp.properties.status;
}
}
retryCount++;
await wait(POLLING_INTERVAL_MS);
}
if (!updateComplete) {
terminal.writeln(terminalLog.warning("Database update timed out. Please check the Azure portal."));
}
}
/**
* Configures a new VNet for CloudShell
*/
public static async configureCloudShellVNet(terminal: Terminal, resolvedRegion: string): Promise<VnetSettings> {
// Use professional and shorter names for resources
const randomSuffix = Math.floor(10000 + Math.random() * 90000);
const subnetName = `cloudshell-subnet-${randomSuffix}`;
const vnetName = `cloudshell-vnet-${randomSuffix}`;
const networkProfileName = `cloudshell-network-profile-${randomSuffix}`;
const relayName = `cloudshell-relay-${randomSuffix}`;
terminal.writeln(terminalLog.header("Network Resource Configuration"));
const azureContainerInstanceOID = await askQuestion(
terminal,
"Enter Azure Container Instance OID (Refer. https://learn.microsoft.com/en-us/azure/cloud-shell/vnet/deployment#get-the-azure-container-instance-id)",
DEFAULT_CONTAINER_INSTANCE_OID
);
const vNetSubscriptionId = await askQuestion(
terminal,
"Enter Virtual Network Subscription ID",
userContext.subscriptionId
);
const vNetResourceGroup = await askQuestion(
terminal,
"Enter Virtual Network Resource Group",
userContext.resourceGroup
);
// Step 1: Create VNet with Subnet
terminal.writeln(terminalLog.header("Deploying Network Resources"));
const vNetConfigPayload = await this.createCloudShellVnet(
resolvedRegion,
subnetName,
terminal,
vnetName,
vNetSubscriptionId,
vNetResourceGroup
);
// Step 2: Create Network Profile
await this.createNetworkProfileWithVnet(
vNetSubscriptionId,
vNetResourceGroup,
vnetName,
subnetName,
resolvedRegion,
terminal,
networkProfileName
);
// Step 3: Create Network Relay
await this.createNetworkRelay(
resolvedRegion,
terminal,
relayName,
vNetSubscriptionId,
vNetResourceGroup
);
// Step 4: Assign Roles
terminal.writeln(terminalLog.header("Configuring Security Permissions"));
await this.assignRoleToNetworkProfile(
azureContainerInstanceOID,
vNetSubscriptionId,
terminal,
networkProfileName,
vNetResourceGroup
);
await this.assignRoleToRelay(
azureContainerInstanceOID,
vNetSubscriptionId,
terminal,
relayName,
vNetResourceGroup
);
// Step 5: Create and return VNet settings
const networkProfileResourceId = `/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Network/networkProfiles/${networkProfileName.replace(/[\n\r]/g, "")}`;
const relayResourceId = `/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Relay/namespaces/${relayName.replace(/[\n\r]/g, "")}`;
terminal.writeln(terminalLog.success("Network configuration complete"));
return {
networkProfileResourceId,
relayNamespaceResourceId: relayResourceId,
location: vNetConfigPayload.location
};
}
/**
* Creates a VNet for CloudShell
*/
private static async createCloudShellVnet(
resolvedRegion: string,
subnetName: string,
terminal: Terminal,
vnetName: string,
vNetSubscriptionId: string,
vNetResourceGroup: string
): Promise<any> {
const vNetConfigPayload = {
location: resolvedRegion,
properties: {
addressSpace: {
addressPrefixes: [DEFAULT_VNET_ADDRESS_PREFIX],
},
subnets: [
{
name: subnetName,
properties: {
addressPrefix: DEFAULT_SUBNET_ADDRESS_PREFIX,
delegations: [
{
name: "CloudShellDelegation",
properties: {
serviceName: "Microsoft.ContainerInstance/containerGroups"
}
}
],
},
},
],
},
};
terminal.writeln(terminalLog.vnet(`Creating VNet: ${vnetName}`));
let vNetResponse = await updateVnet<any>(
`/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Network/virtualNetworks/${vnetName}`,
vNetConfigPayload
);
while (vNetResponse?.properties?.provisioningState !== "Succeeded") {
vNetResponse = await getVnet<any>(
`/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Network/virtualNetworks/${vnetName}`
);
const vNetState = vNetResponse?.properties?.provisioningState;
if (vNetState !== "Succeeded" && vNetState !== "Failed") {
await wait(POLLING_INTERVAL_MS);
terminal.writeln(terminalLog.progress("VNet deployment", vNetState));
} else {
break;
}
}
terminal.writeln(terminalLog.success("VNet created successfully"));
return vNetConfigPayload;
}
/**
* Creates a Network Profile for CloudShell
*/
private static async createNetworkProfileWithVnet(
vNetSubscriptionId: string,
vNetResourceGroup: string,
vnetName: string,
subnetName: string,
resolvedRegion: string,
terminal: Terminal,
networkProfileName: string
): Promise<void> {
const subnetId = `/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Network/virtualNetworks/${vnetName}/subnets/${subnetName}`;
const createNetworkProfilePayload = {
location: resolvedRegion,
properties: {
containerNetworkInterfaceConfigurations: [
{
name: 'defaultContainerNicConfig',
properties: {
ipConfigurations: [
{
name: 'defaultContainerIpConfig',
properties: {
subnet: {
id: subnetId,
}
}
}
]
}
}
]
}
};
terminal.writeln(terminalLog.vnet("Creating Network Profile"));
let networkProfileResponse = await createNetworkProfile<any>(
`/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Network/networkProfiles/${networkProfileName}`,
createNetworkProfilePayload
);
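// Poll until the Network Profile reaches a terminal provisioning state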
while (networkProfileResponse?.properties?.provisioningState !== "Succeeded") {
networkProfileResponse = await getNetworkProfileInfo<any>(
`/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Network/networkProfiles/${networkProfileName}`
);
const networkProfileState = networkProfileResponse?.properties?.provisioningState;
if (networkProfileState !== "Succeeded" && networkProfileState !== "Failed") {
await wait(POLLING_INTERVAL_MS);
terminal.writeln(terminalLog.progress("Network Profile", networkProfileState));
} else {
break;
}
}
if (networkProfileResponse?.properties?.provisioningState === "Failed") {
throw new Error(`Network Profile provisioning failed for ${networkProfileName}`);
}
terminal.writeln(terminalLog.success("Network Profile created successfully"));
}
/**
* Creates a Network Relay for CloudShell
*/
private static async createNetworkRelay(
resolvedRegion: string,
terminal: Terminal,
relayName: string,
vNetSubscriptionId: string,
vNetResourceGroup: string
): Promise<void> {
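// CloudShell uses an Azure Relay namespace (Standard SKU) to tunnel terminal traffic
// to the container instance running inside the virtual network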
const relayPayload = {
location: resolvedRegion,
sku: {
name: STANDARD_SKU,
tier: STANDARD_SKU,
}
};
terminal.writeln(terminalLog.vnet("Creating Relay Namespace"));
let relayResponse = await createRelay<any>(
`/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Relay/namespaces/${relayName}`,
relayPayload
);
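// Poll until the Relay Namespace reaches a terminal provisioning state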
while (relayResponse?.properties?.provisioningState !== "Succeeded") {
relayResponse = await getRelay<any>(
`/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Relay/namespaces/${relayName}`
);
const relayState = relayResponse?.properties?.provisioningState;
if (relayState !== "Succeeded" && relayState !== "Failed") {
await wait(POLLING_INTERVAL_MS);
terminal.writeln(terminalLog.progress("Relay Namespace", relayState));
} else {
break;
}
}
if (relayResponse?.properties?.provisioningState === "Failed") {
throw new Error(`Relay Namespace provisioning failed for ${relayName}`);
}
terminal.writeln(terminalLog.success("Relay Namespace created successfully"));
}
/**
* Assigns a role to a Network Profile
*/
private static async assignRoleToNetworkProfile(
azureContainerInstanceOID: string,
vNetSubscriptionId: string,
terminal: Terminal,
networkProfileName: string,
vNetResourceGroup: string
): Promise<void> {
const nfRoleName = uuidv4();
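// 4d97b98b-1d4f-4787-a291-c67834d212e7 is the built-in "Network Contributor" role,
// granted to the Azure Container Instance service principal on the network profile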
const networkProfileRoleAssignmentPayload = {
properties: {
principalId: azureContainerInstanceOID,
roleDefinitionId: `/subscriptions/${vNetSubscriptionId}/providers/Microsoft.Authorization/roleDefinitions/4d97b98b-1d4f-4787-a291-c67834d212e7`
}
};
terminal.writeln(terminalLog.info("Assigning permissions to Network Profile"));
await createRoleOnNetworkProfile<any>(
`/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Network/networkProfiles/${networkProfileName}/providers/Microsoft.Authorization/roleAssignments/${nfRoleName}`,
networkProfileRoleAssignmentPayload
);
terminal.writeln(terminalLog.success("Network Profile permissions assigned"));
}
/**
* Assigns a role to a Network Relay
*/
private static async assignRoleToRelay(
azureContainerInstanceOID: string,
vNetSubscriptionId: string,
terminal: Terminal,
relayName: string,
vNetResourceGroup: string
): Promise<void> {
const relayRoleName = uuidv4();
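// b24988ac-6180-42a0-ab88-20f7382dd24c is the built-in "Contributor" role,
// granted to the Azure Container Instance service principal on the Relay namespace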
const relayRoleAssignmentPayload = {
properties: {
principalId: azureContainerInstanceOID,
roleDefinitionId: `/subscriptions/${vNetSubscriptionId}/providers/Microsoft.Authorization/roleDefinitions/b24988ac-6180-42a0-ab88-20f7382dd24c`,
}
};
terminal.writeln(terminalLog.info("Assigning permissions to Relay Namespace"));
await createRoleOnRelay<any>(
`/subscriptions/${vNetSubscriptionId}/resourceGroups/${vNetResourceGroup}/providers/Microsoft.Relay/namespaces/${relayName}/providers/Microsoft.Authorization/roleAssignments/${relayRoleName}`,
relayRoleAssignmentPayload
);
terminal.writeln(terminalLog.success("Relay Namespace permissions assigned"));
}
}


@@ -0,0 +1,80 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Cassandra shell type handler
*/
import { Terminal } from "xterm";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { userContext } from "../../../../UserContext";
import { listKeys } from "../../../../Utils/arm/generatedClients/cosmos/databaseAccounts";
import { setShellType } from "../Data/CloudShellApiClient";
import { NetworkAccessHandler } from "../Network/NetworkAccessHandler";
import { getHostFromUrl } from "../Utils/CommonUtils";
import { ShellTypeConfig } from "./ShellTypeFactory";
export class CassandraShellHandler implements ShellTypeConfig {
private shellType: TerminalKind = TerminalKind.Cassandra;
constructor() {
setShellType(this.shellType);
}
public getShellName(): string {
return "Cassandra";
}
public async getInitialCommands(): Promise<string> {
const dbAccount = userContext.databaseAccount;
const endpoint = dbAccount.properties.cassandraEndpoint;
// Get database key
const dbName = dbAccount.name;
let key = "";
if (dbName) {
const keys = await listKeys(userContext.subscriptionId, userContext.resourceGroup, dbName);
key = keys?.primaryMasterKey || "";
}
const config = {
host: getHostFromUrl(endpoint),
name: dbAccount.name,
password: key,
endpoint: endpoint
};
return this.getCommands(config).join("\n").concat("\n");
}
public async configureNetworkAccess(terminal: Terminal, region: string): Promise<{
vNetSettings: any;
isAllPublicAccessEnabled: boolean;
}> {
return await NetworkAccessHandler.configureNetworkAccess(terminal, region, this.shellType);
}
private getCommands(config: any): string[] {
return [
// 1. Fetch and display location details in a readable format
"curl -s https://ipinfo.io | jq -r '\"Region: \" + .region + \" Country: \" + .country + \" City: \" + .city + \" IP Addr: \" + .ip'",
// 2. Check if cqlsh is installed; if not, proceed with installation
"if ! command -v cqlsh &> /dev/null; then echo '⚠️ cqlsh not found. Installing...'; fi",
// 3. Download Cassandra if not installed
"if ! command -v cqlsh &> /dev/null; then curl -LO https://archive.apache.org/dist/cassandra/5.0.3/apache-cassandra-5.0.3-bin.tar.gz; fi",
// 4. Extract Cassandra package if not installed
"if ! command -v cqlsh &> /dev/null; then tar -xvzf apache-cassandra-5.0.3-bin.tar.gz; fi",
// 5. Move Cassandra binaries if not installed
"if ! command -v cqlsh &> /dev/null; then mkdir -p ~/cassandra && mv apache-cassandra-5.0.3/* ~/cassandra/; fi",
// 6. Add Cassandra to PATH if not installed
"if ! command -v cqlsh &> /dev/null; then echo 'export PATH=$HOME/cassandra/bin:$PATH' >> ~/.bashrc; fi",
// 7. Set environment variables for SSL
"if ! command -v cqlsh &> /dev/null; then echo 'export SSL_VERSION=TLSv1_2' >> ~/.bashrc; fi",
"if ! command -v cqlsh &> /dev/null; then echo 'export SSL_VALIDATE=false' >> ~/.bashrc; fi",
// 8. Source .bashrc to update PATH (even if cqlsh was already installed)
"source ~/.bashrc",
// 9. Verify cqlsh installation
"cqlsh --version",
// 10. Login to Cassandra
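//     (Cosmos DB's API for Apache Cassandra listens on port 10350 and requires SSL)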
`cqlsh ${config.host} 10350 -u ${config.name} -p ${config.password} --ssl --protocol-version=4`
];
}
}


@@ -0,0 +1,77 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Mongo shell type handler
*/
import { Terminal } from "xterm";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { userContext } from "../../../../UserContext";
import { listKeys } from "../../../../Utils/arm/generatedClients/cosmos/databaseAccounts";
import { setShellType } from "../Data/CloudShellApiClient";
import { NetworkAccessHandler } from "../Network/NetworkAccessHandler";
import { getHostFromUrl } from "../Utils/CommonUtils";
import { ShellTypeConfig } from "./ShellTypeFactory";
export class MongoShellHandler implements ShellTypeConfig {
private shellType: TerminalKind = TerminalKind.Mongo;
constructor() {
setShellType(this.shellType);
}
public getShellName(): string {
return "MongoDB";
}
public async getInitialCommands(): Promise<string> {
const dbAccount = userContext.databaseAccount;
const endpoint = dbAccount.properties.mongoEndpoint;
// Get database key
const dbName = dbAccount.name;
let key = "";
if (dbName) {
const keys = await listKeys(userContext.subscriptionId, userContext.resourceGroup, dbName);
key = keys?.primaryMasterKey || "";
}
const config = {
host: getHostFromUrl(endpoint),
name: dbAccount.name,
password: key,
endpoint: endpoint
};
return this.getCommands(config).join("\n").concat("\n");
}
public async configureNetworkAccess(terminal: Terminal, region: string): Promise<{
vNetSettings: any;
isAllPublicAccessEnabled: boolean;
}> {
return await NetworkAccessHandler.configureNetworkAccess(terminal, region, this.shellType);
}
private getCommands(config: any): string[] {
return [
// 1. Fetch and display location details in a readable format
"curl -s https://ipinfo.io | jq -r '\"Region: \" + .region + \" Country: \" + .country + \" City: \" + .city + \" IP Addr: \" + .ip'",
// 2. Check if mongosh is installed; if not, proceed with installation
"if ! command -v mongosh &> /dev/null; then echo '⚠️ mongosh not found. Installing...'; fi",
// 3. Download mongosh if not installed
"if ! command -v mongosh &> /dev/null; then curl -LO https://downloads.mongodb.com/compass/mongosh-2.3.8-linux-x64.tgz; fi",
// 4. Extract mongosh package if not installed
"if ! command -v mongosh &> /dev/null; then tar -xvzf mongosh-2.3.8-linux-x64.tgz; fi",
// 5. Move mongosh binaries if not installed
"if ! command -v mongosh &> /dev/null; then mkdir -p ~/mongosh && mv mongosh-2.3.8-linux-x64/* ~/mongosh/; fi",
// 6. Add mongosh to PATH if not installed
"if ! command -v mongosh &> /dev/null; then echo 'export PATH=$HOME/mongosh/bin:$PATH' >> ~/.bashrc; fi",
// 7. Source .bashrc to update PATH (even if mongosh was already installed)
"source ~/.bashrc",
// 8. Verify mongosh installation
"mongosh --version",
// 9. Login to MongoDB
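//    (Cosmos DB's API for MongoDB listens on port 10255 over TLS)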
`mongosh --host ${config.host} --port 10255 --username ${config.name} --password ${config.password} --tls --tlsAllowInvalidCertificates`
];
}
}


@@ -0,0 +1,82 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* PostgreSQL shell type handler
*/
import { Terminal } from "xterm";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { userContext } from "../../../../UserContext";
import { listKeys } from "../../../../Utils/arm/generatedClients/cosmos/databaseAccounts";
import { setShellType } from "../Data/CloudShellApiClient";
import { NetworkAccessHandler } from "../Network/NetworkAccessHandler";
import { getHostFromUrl } from "../Utils/CommonUtils";
import { ShellTypeConfig } from "./ShellTypeFactory";
export class PostgresShellHandler implements ShellTypeConfig {
private shellType: TerminalKind = TerminalKind.Postgres;
constructor() {
setShellType(this.shellType);
}
public getShellName(): string {
return "PostgreSQL";
}
public async getInitialCommands(): Promise<string> {
const dbAccount = userContext.databaseAccount;
const endpoint = dbAccount.properties.postgresqlEndpoint;
// Get database key
const dbName = dbAccount.name;
let key = "";
if (dbName) {
const keys = await listKeys(userContext.subscriptionId, userContext.resourceGroup, dbName);
key = keys?.primaryMasterKey || "";
}
const config = {
host: getHostFromUrl(endpoint),
name: dbAccount.name,
password: key,
endpoint: endpoint
};
return this.getCommands(config).join("\n").concat("\n");
}
public async configureNetworkAccess(terminal: Terminal, region: string): Promise<{
vNetSettings: any;
isAllPublicAccessEnabled: boolean;
}> {
return await NetworkAccessHandler.configureNetworkAccess(terminal, region, this.shellType);
}
private getCommands(config: any): string[] {
return [
// 1. Fetch and display location details in a readable format
"curl -s https://ipinfo.io | jq -r '\"Region: \" + .region + \" Country: \" + .country + \" City: \" + .city + \" IP Addr: \" + .ip'",
// 2. Check if psql is installed; if not, proceed with installation
"if ! command -v psql &> /dev/null; then echo '⚠️ psql not found. Installing...'; fi",
// 3. Download PostgreSQL if not installed
"if ! command -v psql &> /dev/null; then curl -LO https://ftp.postgresql.org/pub/source/v15.2/postgresql-15.2.tar.bz2; fi",
// 4. Extract PostgreSQL package if not installed
"if ! command -v psql &> /dev/null; then tar -xvjf postgresql-15.2.tar.bz2; fi",
// 5. Create a directory for PostgreSQL installation if not installed
"if ! command -v psql &> /dev/null; then mkdir -p ~/pgsql; fi",
// 6. Download readline (dependency for PostgreSQL) if not installed
"if ! command -v psql &> /dev/null; then curl -LO https://ftp.gnu.org/gnu/readline/readline-8.1.tar.gz; fi",
// 7. Extract readline package if not installed
"if ! command -v psql &> /dev/null; then tar -xvzf readline-8.1.tar.gz; fi",
// 8. Configure readline if not installed
"if ! command -v psql &> /dev/null; then cd readline-8.1 && ./configure --prefix=$HOME/pgsql; fi",
// 9. Add PostgreSQL to PATH if not installed
"if ! command -v psql &> /dev/null; then echo 'export PATH=$HOME/pgsql/bin:$PATH' >> ~/.bashrc; fi",
// 10. Source .bashrc to update PATH (even if psql was already installed)
"source ~/.bashrc",
// 11. Verify PostgreSQL installation
"psql --version",
// 12. Prompt for the target database and user, then connect to PostgreSQL
`read -p "Enter Database Name: " dbname && read -p "Enter Username: " username && psql "host=${config.endpoint} port=5432 dbname=$dbname user=$username sslmode=require"`
];
}
}


@@ -0,0 +1,57 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* Factory for creating shell type handlers
*/
import { Terminal } from "xterm";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { CassandraShellHandler } from "./CassandraShellHandler";
import { MongoShellHandler } from "./MongoShellHandler";
import { PostgresShellHandler } from "./PostgresShellHandler";
import { VCoreMongoShellHandler } from "./VCoreMongoShellHandler";
export interface ShellTypeConfig {
getShellName(): string;
getInitialCommands(): Promise<string>;
configureNetworkAccess(terminal: Terminal, region: string): Promise<{
vNetSettings: any;
isAllPublicAccessEnabled: boolean;
}>;
}
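// Illustrative usage (not part of this module): a terminal component would resolve the
// handler for its TerminalKind and prepend the returned commands to the shell session:
//   const handler = ShellTypeHandler.getHandler(TerminalKind.Mongo);
//   const initialCommands = await handler.getInitialCommands();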
export class ShellTypeHandler {
/**
* Gets the appropriate handler for the given shell type
*/
public static getHandler(shellType: TerminalKind): ShellTypeConfig {
switch (shellType) {
case TerminalKind.Postgres:
return new PostgresShellHandler();
case TerminalKind.Mongo:
return new MongoShellHandler();
case TerminalKind.VCoreMongo:
return new VCoreMongoShellHandler();
case TerminalKind.Cassandra:
return new CassandraShellHandler();
default:
throw new Error(`Unsupported shell type: ${shellType}`);
}
}
/**
* Gets the display name for a shell type
*/
public static getShellNameForDisplay(terminalKind: TerminalKind): string {
switch (terminalKind) {
case TerminalKind.Postgres:
return "PostgreSQL";
case TerminalKind.Mongo:
case TerminalKind.VCoreMongo:
return "MongoDB";
case TerminalKind.Cassandra:
return "Cassandra";
default:
return "";
}
}
}


@@ -0,0 +1,78 @@
/**
* Copyright (c) Microsoft Corporation. All rights reserved.
* VCore MongoDB shell type handler
*/
import { Terminal } from "xterm";
import { TerminalKind } from "../../../../Contracts/ViewModels";
import { userContext } from "../../../../UserContext";
import { listKeys } from "../../../../Utils/arm/generatedClients/cosmos/databaseAccounts";
import { setShellType } from "../Data/CloudShellApiClient";
import { NetworkAccessHandler } from "../Network/NetworkAccessHandler";
import { getHostFromUrl } from "../Utils/CommonUtils";
import { ShellTypeConfig } from "./ShellTypeFactory";
export class VCoreMongoShellHandler implements ShellTypeConfig {
private shellType: TerminalKind = TerminalKind.VCoreMongo;
constructor() {
setShellType(this.shellType);
}
public getShellName(): string {
return "MongoDB VCore";
}
public async getInitialCommands(): Promise<string> {
const dbAccount = userContext.databaseAccount;
const endpoint = dbAccount.properties.vcoreMongoEndpoint;
// Get database key
const dbName = dbAccount.name;
let key = "";
if (dbName) {
const keys = await listKeys(userContext.subscriptionId, userContext.resourceGroup, dbName);
key = keys?.primaryMasterKey || "";
}
const config = {
host: getHostFromUrl(endpoint),
name: dbAccount.name,
password: key,
endpoint: endpoint
};
return this.getCommands(config).join("\n").concat("\n");
}
public async configureNetworkAccess(terminal: Terminal, region: string): Promise<{
vNetSettings: any;
isAllPublicAccessEnabled: boolean;
}> {
// VCore MongoDB uses private endpoints
return await NetworkAccessHandler.configureNetworkAccess(terminal, region, this.shellType);
}
private getCommands(config: any): string[] {
return [
// 1. Fetch and display location details in a readable format
"curl -s https://ipinfo.io | jq -r '\"Region: \" + .region + \" Country: \" + .country + \" City: \" + .city + \" IP Addr: \" + .ip'",
// 2. Check if mongosh is installed; if not, proceed with installation
"if ! command -v mongosh &> /dev/null; then echo '⚠️ mongosh not found. Installing...'; fi",
// 3. Download mongosh if not installed
"if ! command -v mongosh &> /dev/null; then curl -LO https://downloads.mongodb.com/compass/mongosh-2.3.8-linux-x64.tgz; fi",
// 4. Extract mongosh package if not installed
"if ! command -v mongosh &> /dev/null; then tar -xvzf mongosh-2.3.8-linux-x64.tgz; fi",
// 5. Move mongosh binaries if not installed
"if ! command -v mongosh &> /dev/null; then mkdir -p ~/mongosh && mv mongosh-2.3.8-linux-x64/* ~/mongosh/; fi",
// 6. Add mongosh to PATH if not installed
"if ! command -v mongosh &> /dev/null; then echo 'export PATH=$HOME/mongosh/bin:$PATH' >> ~/.bashrc; fi",
// 7. Source .bashrc to update PATH (even if mongosh was already installed)
"source ~/.bashrc",
// 8. Verify mongosh installation
"mongosh --version",
// 9. Login to MongoDB
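//    The username is prompted for with `read`; the vCore cluster is reached over TLS with SCRAM-SHA-256 authentication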
`read -p "Enter username: " username && mongosh "mongodb+srv://$username:@${config.endpoint}/?authMechanism=SCRAM-SHA-256&retrywrites=false&maxIdleTimeMS=120000" --tls --tlsAllowInvalidCertificates`
];
}
}

Some files were not shown because too many files have changed in this diff.