Initial code commit
This commit is contained in:
parent
cec84d950e
commit
63070531cd
54
README.md
@@ -1,14 +1,48 @@

# Cosmos DB Language Service

Azure Cosmos DB Language Service for the Monaco editor

`npm install @azure/cosmos-language-service`

### Supported Cosmos DB languages

+ [Cosmos DB SQL](https://docs.microsoft.com/en-us/azure/cosmos-db/sql-api-sql-query-reference#bk_from_clause)

### Supported Features

+ Autocomplete
+ Error marks

### Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information, see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.

#### Set up

1. Install antlr4 and set up the environment based on the [getting-started doc](https://github.com/antlr/antlr4/blob/master/doc/getting-started.md)
2. If you are using `vscode`, we highly recommend the [antlr4 vscode plugin](https://marketplace.visualstudio.com/items?itemName=mike-lischke.vscode-antlr4)
3. Install `node` (>= v8.9.0) and `npm` (>= v5.8.0)
4. Clone the source code (repo: https://github.com/Azure/cosmos-sql-language-service)
5. Install the dependencies:

```bash
npm install
```

#### Build and Run

1. Compile the ANTLR grammar (the `doskey` aliases assume a Windows shell; define equivalent shell aliases on other platforms):

```bash
cd $(grammar_folder)
doskey antlr4=java org.antlr.v4.Tool $*
doskey grun=java org.antlr.v4.gui.TestRig $*
antlr4 -no-listener -no-visitor -Dlanguage=JavaScript *.g4 -o ../generated
```

You can keep the lexer.js and parser.js files and delete the others.

2. Build the package:

```bash
cd $(language_service_folder)
webpack
cd $(root_folder)
npm run dev
```
1052
dist/cosmosdb-sql/generated/CosmosDBSqlLexer.js
vendored
Normal file
File diff suppressed because it is too large
1
dist/cosmosdb-sql/generated/CosmosDBSqlLexer.js.map
vendored
Normal file
File diff suppressed because one or more lines are too long
4178
dist/cosmosdb-sql/generated/CosmosDBSqlParser.js
vendored
Normal file
File diff suppressed because it is too large
1
dist/cosmosdb-sql/generated/CosmosDBSqlParser.js.map
vendored
Normal file
File diff suppressed because one or more lines are too long
127
dist/cosmosdb-sql/grammar/CosmosDBSqlKeywords.js
vendored
Normal file
@@ -0,0 +1,127 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
var CosmosDBSqlKeywords = /** @class */ (function () {
    function CosmosDBSqlKeywords() {
    }
    CosmosDBSqlKeywords.KeywordTypeHintPrefix = "KeywordTypeHint:";
    CosmosDBSqlKeywords.keywordsRegisteredForCompletion = {
        "AND": "AND",
        "ARRAY": "ARRAY",
        "AS": "AS",
        "ASC": "ASC",
        "BETWEEN": "BETWEEN",
        "BY": "BY",
        "CASE": "CASE",
        "CAST": "CAST",
        "CONVERT": "CONVERT",
        "CROSS": "CROSS",
        "DESC": "DESC",
        "DISTINCT": "DISTINCT",
        "ELSE": "ELSE",
        "END": "END",
        "ESCAPE": "ESCAPE",
        "EXISTS": "EXISTS",
        "K_false": "false",
        "FOR": "FOR",
        "FROM": "FROM",
        "GROUP": "GROUP",
        "HAVING": "HAVING",
        "IN": "IN",
        "INNER": "INNER",
        "INSERT": "INSERT",
        "INTO": "INTO",
        "IS": "IS",
        "JOIN": "JOIN",
        "LEFT": "LEFT",
        "LIKE": "LIKE",
        "LIMIT": "LIMIT",
        "NOT": "NOT",
        "K_null": "null",
        "OFFSET": "OFFSET",
        "ON": "ON",
        "OR": "OR",
        "ORDER": "ORDER",
        "OUTER": "OUTER",
        "OVER": "OVER",
        "RIGHT": "RIGHT",
        "SELECT": "SELECT",
        "SET": "SET",
        "THEN": "THEN",
        "TOP": "TOP",
        "K_true": "true",
        "K_udf": "udf",
        "K_undefined": "undefined",
        "UPDATE": "UPDATE",
        "VALUE": "VALUE",
        "WHEN": "WHEN",
        "WHERE": "WHERE",
        "WITH": "WITH",
        "Infinity": "Infinity",
        "NaN": "NaN",
        "ABS": "ABS",
        "ACOS": "ACOS",
        "ARRAY_CONCAT": "ARRAY_CONCAT",
        "ARRAY_CONTAINS": "ARRAY_CONTAINS",
        "ARRAY_LENGTH": "ARRAY_LENGTH",
        "ARRAY_SLICE": "ARRAY_SLICE",
        "ASIN": "ASIN",
        "ATAN": "ATAN",
        "ATN2": "ATN2",
        "AVG": "AVG",
        "CEILING": "CEILING",
        "CONCAT": "CONCAT",
        "CONTAINS": "CONTAINS",
        "COS": "COS",
        "COT": "COT",
        "COUNT": "COUNT",
        "DEGREES": "DEGREES",
        "ENDSWITH": "ENDSWITH",
        "EXP": "EXP",
        "FLOOR": "FLOOR",
        "INDEX_OF": "INDEX_OF",
        "IS_ARRAY": "IS_ARRAY",
        "IS_BOOL": "IS_BOOL",
        "IS_DEFINED": "IS_DEFINED",
        "IS_FINITE_NUMBER": "IS_FINITE_NUMBER",
        "IS_NULL": "IS_NULL",
        "IS_NUMBER": "IS_NUMBER",
        "IS_OBJECT": "IS_OBJECT",
        "IS_PRIMITIVE": "IS_PRIMITIVE",
        "IS_STRING": "IS_STRING",
        "LENGTH": "LENGTH",
        "LOG10": "LOG10",
        "LOWER": "LOWER",
        "LTRIM": "LTRIM",
        "MAX": "MAX",
        "MIN": "MIN",
        "PI": "PI",
        "POWER": "POWER",
        "RADIANS": "RADIANS",
        "RAND": "RAND",
        "REPLACE": "REPLACE",
        "REPLICATE": "REPLICATE",
        "REVERSE": "REVERSE",
        "ROUND": "ROUND",
        "RTRIM": "RTRIM",
        "SIGN": "SIGN",
        "SIN": "SIN",
        "SQRT": "SQRT",
        "SQUARE": "SQUARE",
        "ST_DISTANCE": "ST_DISTANCE",
        "ST_INTERSECTS": "ST_INTERSECTS",
        "ST_ISVALID": "ST_ISVALID",
        "ST_ISVALIDDETAILED": "ST_ISVALIDDETAILED",
        "ST_WITHIN": "ST_WITHIN",
        "STARTSWITH": "STARTSWITH",
        "SUBSTRING": "SUBSTRING",
        "SUM": "SUM",
        "TAN": "TAN",
        "TRUNC": "TRUNC",
        "UPPER": "UPPER",
        "ID": CosmosDBSqlKeywords.KeywordTypeHintPrefix + "ID",
        "NUMBER": CosmosDBSqlKeywords.KeywordTypeHintPrefix + "NUMBER"
    };
    return CosmosDBSqlKeywords;
}());
exports.CosmosDBSqlKeywords = CosmosDBSqlKeywords;
//# sourceMappingURL=CosmosDBSqlKeywords.js.map
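The keyword map above mixes lexer token names with display labels (e.g. `"K_false"` maps to `false`), and a few entries carry the `KeywordTypeHint:` prefix to mark type placeholders rather than literal keywords. A minimal sketch of how a completion list could be derived from such a map (`buildCompletionLabels` is a hypothetical helper, not part of the package):

```javascript
// Sketch: derive completion labels from a keyword map like the one above.
// `buildCompletionLabels` is a hypothetical helper name.
var KeywordTypeHintPrefix = "KeywordTypeHint:";

var keywords = {
    "SELECT": "SELECT",
    "K_false": "false",                   // token name differs from the label users see
    "ID": KeywordTypeHintPrefix + "ID"    // type hint, not a literal keyword
};

function buildCompletionLabels(map) {
    return Object.keys(map)
        .map(function (token) { return map[token]; })
        // entries carrying the type-hint prefix are placeholders, not keywords
        .filter(function (label) { return label.indexOf(KeywordTypeHintPrefix) !== 0; });
}

console.log(buildCompletionLabels(keywords)); // [ 'SELECT', 'false' ]
```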
1
dist/cosmosdb-sql/grammar/CosmosDBSqlKeywords.js.map
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"CosmosDBSqlKeywords.js","sourceRoot":"","sources":["../../../src/cosmosdb-sql/grammar/CosmosDBSqlKeywords.ts"],"names":[],"mappings":";;AAAA;IAAA;IA2HA,CAAC;IAzHkB,yCAAqB,GAAY,kBAAkB,CAAC;IAErD,mDAA+B,GAC7C;QACI,KAAK,EAAE,KAAK;QACZ,OAAO,EAAE,OAAO;QAChB,IAAI,EAAE,IAAI;QACV,KAAK,EAAE,KAAK;QACZ,SAAS,EAAE,SAAS;QACpB,IAAI,EAAE,IAAI;QACV,MAAM,EAAE,MAAM;QACd,MAAM,EAAE,MAAM;QACd,SAAS,EAAE,SAAS;QACpB,OAAO,EAAE,OAAO;QAChB,MAAM,EAAE,MAAM;QACd,UAAU,EAAE,UAAU;QACtB,MAAM,EAAE,MAAM;QACd,KAAK,EAAE,KAAK;QACZ,QAAQ,EAAE,QAAQ;QAClB,QAAQ,EAAE,QAAQ;QAClB,SAAS,EAAE,OAAO;QAClB,KAAK,EAAE,KAAK;QACZ,MAAM,EAAE,MAAM;QACd,OAAO,EAAE,OAAO;QAChB,QAAQ,EAAE,QAAQ;QAClB,IAAI,EAAE,IAAI;QACV,OAAO,EAAE,OAAO;QAChB,QAAQ,EAAE,QAAQ;QAClB,MAAM,EAAE,MAAM;QACd,IAAI,EAAE,IAAI;QACV,MAAM,EAAE,MAAM;QACd,MAAM,EAAE,MAAM;QACd,MAAM,EAAE,MAAM;QACd,OAAO,EAAE,OAAO;QAChB,KAAK,EAAE,KAAK;QACZ,QAAQ,EAAE,MAAM;QAChB,QAAQ,EAAE,QAAQ;QAClB,IAAI,EAAE,IAAI;QACV,IAAI,EAAE,IAAI;QACV,OAAO,EAAE,OAAO;QAChB,OAAO,EAAE,OAAO;QAChB,MAAM,EAAE,MAAM;QACd,OAAO,EAAE,OAAO;QAChB,QAAQ,EAAE,QAAQ;QAClB,KAAK,EAAE,KAAK;QACZ,MAAM,EAAE,MAAM;QACd,KAAK,EAAE,KAAK;QACZ,QAAQ,EAAE,MAAM;QAChB,OAAO,EAAE,KAAK;QACd,aAAa,EAAE,WAAW;QAC1B,QAAQ,EAAE,QAAQ;QAClB,OAAO,EAAE,OAAO;QAChB,MAAM,EAAE,MAAM;QACd,OAAO,EAAE,OAAO;QAChB,MAAM,EAAE,MAAM;QACd,UAAU,EAAE,UAAU;QACtB,KAAK,EAAE,KAAK;QAEZ,KAAK,EAAE,KAAK;QACZ,MAAM,EAAE,MAAM;QACd,cAAc,EAAE,cAAc;QAC9B,gBAAgB,EAAE,gBAAgB;QAClC,cAAc,EAAE,cAAc;QAC9B,aAAa,EAAE,aAAa;QAC5B,MAAM,EAAE,MAAM;QACd,MAAM,EAAE,MAAM;QACd,MAAM,EAAE,MAAM;QACd,KAAK,EAAE,KAAK;QACZ,SAAS,EAAE,SAAS;QACpB,QAAQ,EAAE,QAAQ;QAClB,UAAU,EAAE,UAAU;QACtB,KAAK,EAAE,KAAK;QACZ,KAAK,EAAE,KAAK;QACZ,OAAO,EAAE,OAAO;QAChB,SAAS,EAAE,SAAS;QACpB,UAAU,EAAE,UAAU;QACtB,KAAK,EAAE,KAAK;QACZ,OAAO,EAAE,OAAO;QAChB,UAAU,EAAE,UAAU;QACtB,SAAS,EAAE,SAAS;QACpB,SAAS,EAAE,SAAS;QACpB,YAAY,EAAE,YAAY;QAC1B,kBAAkB,EAAE,kBAAkB;QACtC,SAAS,EAAE,SAAS;QACpB,WAAW,EAAE,WAAW;QACxB,WAAW,EAAE,WAAW;QACxB,cAAc,EAAE,cAAc;QAC9B,WAAW,EAAE,WAAW;QACxB,QAAQ,EAAE,QAAQ;QAClB,OAAO,EAAE,OAAO;QAChB,OAAO,EAAE,O
AAO;QAChB,OAAO,EAAE,OAAO;QAChB,KAAK,EAAE,KAAK;QACZ,KAAK,EAAE,KAAK;QACZ,IAAI,EAAE,IAAI;QACV,OAAO,EAAE,OAAO;QAChB,SAAS,EAAE,SAAS;QACpB,MAAM,EAAE,MAAM;QACd,SAAS,EAAE,SAAS;QACpB,WAAW,EAAE,WAAW;QACxB,SAAS,EAAE,SAAS;QACpB,OAAO,EAAE,OAAO;QAChB,OAAO,EAAE,OAAO;QAChB,MAAM,EAAE,MAAM;QACd,KAAK,EAAE,KAAK;QACZ,MAAM,EAAE,MAAM;QACd,QAAQ,EAAE,QAAQ;QAClB,aAAa,EAAE,aAAa;QAC5B,eAAe,EAAE,eAAe;QAChC,YAAY,EAAE,YAAY;QAC1B,oBAAoB,EAAE,oBAAoB;QAC1C,WAAW,EAAE,WAAW;QACxB,YAAY,EAAE,YAAY;QAC1B,WAAW,EAAE,WAAW;QACxB,KAAK,EAAE,KAAK;QACZ,KAAK,EAAE,KAAK;QACZ,OAAO,EAAE,OAAO;QAChB,OAAO,EAAE,OAAO;QAChB,IAAI,EAAE,mBAAmB,CAAC,qBAAqB,GAAG,IAAI;QACtD,QAAQ,EAAE,mBAAmB,CAAC,qBAAqB,GAAG,QAAQ;KACjE,CAAC;IACN,0BAAC;CAAA,AA3HD,IA2HC;AA3HY,kDAAmB"}
74
dist/facade/LanguageServiceFacade.js
vendored
Normal file
@@ -0,0 +1,74 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
var Q = require("q");
var monaco = require("monaco-editor");
var ParseReason;
(function (ParseReason) {
    ParseReason[ParseReason["GetCompletionWords"] = 1] = "GetCompletionWords";
    ParseReason[ParseReason["GetErrors"] = 2] = "GetErrors";
})(ParseReason = exports.ParseReason || (exports.ParseReason = {}));
var LanguageServiceFacade = /** @class */ (function () {
    function LanguageServiceFacade() {
    }
    LanguageServiceFacade.GetLanguageServiceParseResult = function (str, parseReason) {
        var timeExceeded = Q.Promise(function (resolve, reject) {
            var wait = setTimeout(function () {
                var words = [];
                resolve(words);
            }, LanguageServiceFacade.timeout);
        });
        var result = LanguageServiceFacade.GetParseResult(str, parseReason);
        return Q.race([timeExceeded, result]).then(function (words) {
            LanguageServiceFacade.workingWorker.terminate();
            return words;
        });
    };
    LanguageServiceFacade.timeout = 2000;
    LanguageServiceFacade.workingWorker = null;
    LanguageServiceFacade.GetParseResult = function (str, parseReason) {
        return Q.Promise(function (resolve) {
            if (LanguageServiceFacade.workingWorker != null) {
                LanguageServiceFacade.workingWorker.terminate();
            }
            var currentUrlWithoutQueryParamsAndHashRoute = "http://" + window.location.host + window.location.pathname;
            var url = currentUrlWithoutQueryParamsAndHashRoute.replace(/\/[^\/]*$/, '/node_modules/cosmosdb-language-service/dist/worker/dist/LanguageServiceWorker.js');
            LanguageServiceFacade.workingWorker = new Worker(url);
            LanguageServiceFacade.workingWorker.onmessage = function (ev) {
                var processedResults = [];
                var results = ev.data;
                if (parseReason === ParseReason.GetCompletionWords) {
                    results.forEach(function (label) {
                        if (!!label) {
                            processedResults.push({
                                label: label,
                                kind: monaco.languages.CompletionItemKind.Keyword
                            });
                        }
                    });
                }
                else if (parseReason === ParseReason.GetErrors) {
                    results.forEach(function (err) {
                        var mark = {
                            severity: monaco.MarkerSeverity.Error,
                            message: err.Message,
                            startLineNumber: err.line,
                            startColumn: err.column,
                            endLineNumber: err.line,
                            endColumn: err.column
                        };
                        processedResults.push(mark);
                    });
                }
                resolve(processedResults);
            };
            var source = {
                code: str,
                reason: parseReason
            };
            LanguageServiceFacade.workingWorker.postMessage(source);
        });
    };
    return LanguageServiceFacade;
}());
exports.LanguageServiceFacade = LanguageServiceFacade;
//# sourceMappingURL=LanguageServiceFacade.js.map
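`GetLanguageServiceParseResult` races the worker's parse result against a 2-second timer so the editor never blocks on a hung parse: whichever promise settles first wins, and a timeout yields an empty word list. The same pattern, sketched with native Promises instead of Q (`parseWithTimeout` is a hypothetical name):

```javascript
// Sketch of the timeout-race pattern used by GetLanguageServiceParseResult,
// written with native Promises instead of Q. `parseWithTimeout` is hypothetical.
function parseWithTimeout(parsePromise, timeoutMs) {
    var timeExceeded = new Promise(function (resolve) {
        // fall back to an empty word list if the parse takes too long
        setTimeout(function () { resolve([]); }, timeoutMs);
    });
    return Promise.race([timeExceeded, parsePromise]);
}

// A parse that never finishes resolves to the empty result:
parseWithTimeout(new Promise(function () {}), 50).then(function (words) {
    console.log(words); // []
});
```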
1
dist/facade/LanguageServiceFacade.js.map
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"LanguageServiceFacade.js","sourceRoot":"","sources":["../../src/facade/LanguageServiceFacade.ts"],"names":[],"mappings":";;AAAA,qBAAuB;AACvB,sCAAwC;AAExC,IAAY,WAGX;AAHD,WAAY,WAAW;IACnB,yEAAsB,CAAA;IACtB,uDAAa,CAAA;AACjB,CAAC,EAHW,WAAW,GAAX,mBAAW,KAAX,mBAAW,QAGtB;AAED;IAAA;IAsEA,CAAC;IAjEiB,mDAA6B,GAA3C,UAA4C,GAAY,EAAE,WAAyB;QAC/E,IAAM,YAAY,GAAG,CAAC,CAAC,OAAO,CAAQ,UAAC,OAAa,EAAE,MAAY;YAC9D,IAAM,IAAI,GAAG,UAAU,CAAC;gBACpB,IAAM,KAAK,GAAS,EAAE,CAAC;gBACvB,OAAO,CAAC,KAAK,CAAC,CAAC;YACnB,CAAC,EAAE,qBAAqB,CAAC,OAAO,CAAC,CAAC;QACtC,CAAC,CAAC,CAAC;QAEH,IAAM,MAAM,GAAG,qBAAqB,CAAC,cAAc,CAAC,GAAG,EAAE,WAAW,CAAC,CAAC;QACtE,OAAO,CAAC,CAAC,IAAI,CAAC,CAAC,YAAY,EAAE,MAAM,CAAC,CAAC,CAAC,IAAI,CAAC,UAAS,KAAK;YACrD,qBAAqB,CAAC,aAAa,CAAC,SAAS,EAAE,CAAC;YAChD,OAAO,KAAK,CAAC;QACjB,CAAC,CAAC,CAAC;IACP,CAAC;IAjBc,6BAAO,GAAY,IAAI,CAAC;IAExB,mCAAa,GAAY,IAAI,CAAC;IAiB9B,oCAAc,GAAG,UAAC,GAAY,EAAE,WAAyB;QACpE,OAAO,CAAC,CAAC,OAAO,CAAC,UAAS,OAAa;YAEnC,IAAI,qBAAqB,CAAC,aAAa,IAAI,IAAI,EAAE;gBAC7C,qBAAqB,CAAC,aAAa,CAAC,SAAS,EAAE,CAAC;aACnD;YAED,IAAM,wCAAwC,GAAW,YAAU,MAAM,CAAC,QAAQ,CAAC,IAAI,GAAG,MAAM,CAAC,QAAQ,CAAC,QAAU,CAAC;YACrH,IAAI,GAAG,GAAG,wCAAwC,CAAC,OAAO,CAAC,WAAW,EAAE,mFAAmF,CAAC,CAAC;YAC7J,qBAAqB,CAAC,aAAa,GAAG,IAAI,MAAM,CAAC,GAAG,CAAC,CAAC;YAEtD,qBAAqB,CAAC,aAAa,CAAC,SAAS,GAAG,UAAC,EAAiB;gBAC9D,IAAI,gBAAgB,GAAQ,EAAE,CAAC;gBAE/B,IAAI,OAAO,GAAW,EAAE,CAAC,IAAI,CAAC;gBAE9B,IAAI,WAAW,KAAK,WAAW,CAAC,kBAAkB,EAAE;oBAChD,OAAO,CAAC,OAAO,CAAC,UAAC,KAAa;wBAC1B,IAAI,CAAC,CAAC,KAAK,EAAE;4BACT,gBAAgB,CAAC,IAAI,CAAC;gCAClB,KAAK,EAAE,KAAK;gCACZ,IAAI,EAAE,MAAM,CAAC,SAAS,CAAC,kBAAkB,CAAC,OAAO;6BACpD,CAAC,CAAC;yBACN;oBACL,CAAC,CAAC,CAAC;iBACN;qBAAM,IAAI,WAAW,KAAK,WAAW,CAAC,SAAS,EAAE;oBAC9C,OAAO,CAAC,OAAO,CAAC,UAAC,GAAQ;wBACrB,IAAM,IAAI,GAA8B;4BACpC,QAAQ,EAAE,MAAM,CAAC,cAAc,CAAC,KAAK;4BACrC,OAAO,EAAE,GAAG,CAAC,OAAO;4BACpB,eAAe,EAAE,GAAG,CAAC,IAAI;4BACzB,WAAW,EAAE,GAAG,CAAC,MAAM;4BACvB,aAAa,EAAE,GAAG,CAAC,IAAI;4BACvB,SAAS,EAAE,GAAG,CAAC,MAAM;yBACxB,CAAC;wBAEF,gBAAgB,CAAC,IAAI,CAAC,IAAI,CAAC
,CAAA;oBAC/B,CAAC,CAAC,CAAC;iBACN;gBAED,OAAO,CAAC,gBAAgB,CAAC,CAAC;YAC9B,CAAC,CAAA;YAED,IAAM,MAAM,GAAG;gBACX,IAAI,EAAG,GAAG;gBACV,MAAM,EAAG,WAAW;aACvB,CAAC;YACF,qBAAqB,CAAC,aAAa,CAAC,WAAW,CAAC,MAAM,CAAC,CAAC;QAC5D,CAAC,CAAC,CAAC;IACP,CAAC,CAAA;IACL,4BAAC;CAAA,AAtED,IAsEC;AAtEY,sDAAqB"}
43
dist/language-service/LSCommonTokenStream.js
vendored
Normal file
@@ -0,0 +1,43 @@
"use strict";
//-----------------------------------------------------------------------------
// Copyright (c) Microsoft Corporation. All rights reserved.
//-----------------------------------------------------------------------------
var __extends = (this && this.__extends) || (function () {
    var extendStatics = function (d, b) {
        extendStatics = Object.setPrototypeOf ||
            ({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
            function (d, b) { for (var p in b) if (b.hasOwnProperty(p)) d[p] = b[p]; };
        return extendStatics(d, b);
    }
    return function (d, b) {
        extendStatics(d, b);
        function __() { this.constructor = d; }
        d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
    };
})();
Object.defineProperty(exports, "__esModule", { value: true });
var CommonTokenStream_1 = require("antlr4/CommonTokenStream");
var Token_1 = require("antlr4/Token");
var LSCommonTokenStream = /** @class */ (function (_super) {
    __extends(LSCommonTokenStream, _super);
    function LSCommonTokenStream(tokenSource) {
        return _super.call(this, tokenSource) || this;
    }
    LSCommonTokenStream.prototype.LA = function (i) {
        var token = _super.prototype.LA.call(this, i);
        if (token != null && token == Token_1.Token.EOF && this.EofListener != undefined) {
            this.EofListener();
        }
        return token;
    };
    LSCommonTokenStream.prototype.LT = function (i) {
        var token = _super.prototype.LT.call(this, i);
        if (token != null && token.type == Token_1.Token.EOF && this.EofListener != undefined) {
            this.EofListener();
        }
        return token;
    };
    return LSCommonTokenStream;
}(CommonTokenStream_1.CommonTokenStream));
exports.LSCommonTokenStream = LSCommonTokenStream;
//# sourceMappingURL=LSCommonTokenStream.js.map
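`LSCommonTokenStream` overrides `LA` and `LT` only to fire an `EofListener` callback whenever lookahead reaches EOF, which is how the language service notices that prediction ran off the end of the input. The hook can be sketched independently of the antlr4 runtime with a stub base stream (everything below except the `EofListener` idea is a hypothetical stand-in):

```javascript
// Sketch of the EOF-notification hook, using a stub in place of antlr4's
// CommonTokenStream. `StubTokenStream` and `EofAwareTokenStream` are hypothetical.
var EOF = -1; // antlr4 encodes Token.EOF as -1

function StubTokenStream(tokens) {
    this.tokens = tokens;
    this.pos = 0;
}
// LA(i): look ahead i tokens without consuming; EOF past the end
StubTokenStream.prototype.LA = function (i) {
    var idx = this.pos + i - 1;
    return idx < this.tokens.length ? this.tokens[idx] : EOF;
};

function EofAwareTokenStream(tokens) {
    StubTokenStream.call(this, tokens);
}
EofAwareTokenStream.prototype = Object.create(StubTokenStream.prototype);
EofAwareTokenStream.prototype.LA = function (i) {
    var token = StubTokenStream.prototype.LA.call(this, i);
    if (token === EOF && this.EofListener !== undefined) {
        this.EofListener(); // notify the language service that lookahead hit EOF
    }
    return token;
};

var stream = new EofAwareTokenStream([10, 20]);
var eofSeen = false;
stream.EofListener = function () { eofSeen = true; };
stream.LA(1); // 10, listener does not fire
stream.LA(3); // past the end -> EOF, listener fires
console.log(eofSeen); // true
```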
1
dist/language-service/LSCommonTokenStream.js.map
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"LSCommonTokenStream.js","sourceRoot":"","sources":["../../src/language-service/LSCommonTokenStream.ts"],"names":[],"mappings":";AAAA,+EAA+E;AAC/E,6DAA6D;AAC7D,+EAA+E;;;;;;;;;;;;;;;AAE/E,8DAA6D;AAE7D,sCAAqC;AAErC;IAAyC,uCAAiB;IAGtD,6BAAY,WAAmB;eAC3B,kBAAM,WAAW,CAAC;IACtB,CAAC;IAEM,gCAAE,GAAT,UAAU,CAAU;QAChB,IAAI,KAAK,GAAY,iBAAM,EAAE,YAAC,CAAC,CAAC,CAAC;QAEjC,IAAI,KAAK,IAAI,IAAI,IAAI,KAAK,IAAI,aAAK,CAAC,GAAG,IAAI,IAAI,CAAC,WAAW,IAAI,SAAS,EAAE;YACtE,IAAI,CAAC,WAAW,EAAE,CAAC;SACtB;QACD,OAAO,KAAK,CAAC;IACjB,CAAC;IAEM,gCAAE,GAAT,UAAU,CAAU;QAChB,IAAI,KAAK,GAAG,iBAAM,EAAE,YAAC,CAAC,CAAC,CAAC;QAExB,IAAI,KAAK,IAAI,IAAI,IAAI,KAAK,CAAC,IAAI,IAAI,aAAK,CAAC,GAAG,IAAI,IAAI,CAAC,WAAW,IAAI,SAAS,EAAE;YAC3E,IAAI,CAAC,WAAW,EAAE,CAAC;SACtB;QACD,OAAO,KAAK,CAAC;IACjB,CAAC;IACL,0BAAC;AAAD,CAAC,AAxBD,CAAyC,qCAAiB,GAwBzD;AAxBY,kDAAmB"}
33
dist/language-service/LSErrorListener.js
vendored
Normal file
@@ -0,0 +1,33 @@
"use strict";
//-----------------------------------------------------------------------------
// Copyright (c) Microsoft Corporation. All rights reserved.
//-----------------------------------------------------------------------------
var __extends = (this && this.__extends) || (function () {
    var extendStatics = function (d, b) {
        extendStatics = Object.setPrototypeOf ||
            ({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
            function (d, b) { for (var p in b) if (b.hasOwnProperty(p)) d[p] = b[p]; };
        return extendStatics(d, b);
    }
    return function (d, b) {
        extendStatics(d, b);
        function __() { this.constructor = d; }
        d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
    };
})();
Object.defineProperty(exports, "__esModule", { value: true });
var ErrorListener_1 = require("antlr4/error/ErrorListener");
var LSErrorListener = /** @class */ (function (_super) {
    __extends(LSErrorListener, _super);
    function LSErrorListener(AddSyntaxError) {
        var _this = _super.call(this) || this;
        _this.AddSyntaxError = AddSyntaxError;
        return _this;
    }
    LSErrorListener.prototype.syntaxError = function (recognizer, offendingSymbol, line, column, msg, e) {
        this.AddSyntaxError(msg, line, column);
    };
    return LSErrorListener;
}(ErrorListener_1.ErrorListener));
exports.LSErrorListener = LSErrorListener;
//# sourceMappingURL=LSErrorListener.js.map
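`LSErrorListener` simply forwards each syntax error's message, line, and column to a callback, and `LanguageServiceFacade` later shapes those records into Monaco error marks. A minimal sketch of that collect-and-shape path, without importing monaco or antlr4 (the literal `8` stands in for `monaco.MarkerSeverity.Error`, and the sample error is fabricated for illustration):

```javascript
// Sketch: accumulate syntax errors via an AddSyntaxError-style callback and
// shape them like the markers built in LanguageServiceFacade.
// The numeric severity 8 mirrors monaco.MarkerSeverity.Error, since monaco
// itself is not imported here.
var errors = [];
var addSyntaxError = function (msg, line, column) {
    errors.push({ Message: msg, line: line, column: column });
};

// Simulate what the parser would report on a bad token:
addSyntaxError("mismatched input 'FORM'", 1, 9);

var marks = errors.map(function (err) {
    return {
        severity: 8, // monaco.MarkerSeverity.Error
        message: err.Message,
        startLineNumber: err.line,
        startColumn: err.column,
        endLineNumber: err.line,
        endColumn: err.column
    };
});
console.log(marks.length); // 1
```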
1
dist/language-service/LSErrorListener.js.map
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"LSErrorListener.js","sourceRoot":"","sources":["../../src/language-service/LSErrorListener.ts"],"names":[],"mappings":";AAAA,+EAA+E;AAC/E,6DAA6D;AAC7D,+EAA+E;;;;;;;;;;;;;;;AAE/E,4DAA2D;AAE3D;IAAqC,mCAAa;IAG9C,yBAAY,cAAsE;QAAlF,YACI,iBAAO,SAEV;QADG,KAAI,CAAC,cAAc,GAAG,cAAc,CAAC;;IACzC,CAAC;IAEM,qCAAW,GAAlB,UAAmB,UAAe,EAAE,eAAoB,EAAC,IAAY,EAAE,MAAc,EAAE,GAAW,EAAE,CAAM;QACtG,IAAI,CAAC,cAAc,CAAC,GAAG,EAAE,IAAI,EAAE,MAAM,CAAC,CAAC;IAC3C,CAAC;IACL,sBAAC;AAAD,CAAC,AAXD,CAAqC,6BAAa,GAWjD;AAXY,0CAAe"}
272
dist/language-service/LSParserATNSimulator.js
vendored
Normal file
@@ -0,0 +1,272 @@
"use strict";
//-----------------------------------------------------------------------------
// Copyright (c) Microsoft Corporation. All rights reserved.
//-----------------------------------------------------------------------------
var __extends = (this && this.__extends) || (function () {
    var extendStatics = function (d, b) {
        extendStatics = Object.setPrototypeOf ||
            ({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
            function (d, b) { for (var p in b) if (b.hasOwnProperty(p)) d[p] = b[p]; };
        return extendStatics(d, b);
    }
    return function (d, b) {
        extendStatics(d, b);
        function __() { this.constructor = d; }
        d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
    };
})();
Object.defineProperty(exports, "__esModule", { value: true });
var ATNState = require("antlr4/atn/ATNState");
var Transition = require("antlr4/atn/Transition");
var Errors_1 = require("antlr4/error/Errors");
var ParserATNSimulator_1 = require("antlr4/atn/ParserATNSimulator");
var PredictionMode_1 = require("antlr4/atn/PredictionMode");
var Token_1 = require("antlr4/Token");
var Utils_1 = require("./Utils");
var LSParserATNSimulator = /** @class */ (function (_super) {
    __extends(LSParserATNSimulator, _super);
    function LSParserATNSimulator(parser, atn, decisionToDFA, sharedContextCache, languageService) {
        var _this = _super.call(this, parser, atn, decisionToDFA, sharedContextCache) || this;
        _this.predictionMode = PredictionMode_1.PredictionMode.LL;
        _this.parser = parser;
        _this.atn = atn;
        _this.languageService = languageService;
        return _this;
    }
    LSParserATNSimulator.prototype.adaptivePredict = function (input, decision, outerContext) {
        var tokensLeft = -1;
        try {
            this.languageService.IsInPredict = true;
            this.languageService.EofReachedInPredict = false;
            if (decision >= 0) {
                return _super.prototype.adaptivePredict.call(this, input, decision, outerContext);
            }
        }
        catch (error) {
            if (error instanceof Errors_1.NoViableAltException && error.offendingToken.type === Token_1.Token.EOF) {
                tokensLeft = error.offendingToken.tokenIndex - this.parser.getCurrentToken().tokenIndex;
                return 1;
            }
            else {
                throw error;
            }
        }
        finally {
            if (this.languageService.EofReachedInPredict) {
                if (tokensLeft < 0) {
                    tokensLeft = 0;
                    while (input.LA(tokensLeft + 1) != Token_1.Token.EOF) {
                        tokensLeft++;
                    }
                }
                if (tokensLeft > 0) {
                    var states = this.CalculateValidStates(input, tokensLeft);
                    this.languageService.RecordErrorStatesBeforeEof(states);
                }
            }
            this.languageService.IsInPredict = false;
        }
    };
    LSParserATNSimulator.prototype.CalculateValidStates = function (input, tokensLeft) {
        var _this = this;
        var state = this.atn.states[this.parser.state];
        var states = [{
                state: state,
                transitionStates: []
            }];
        var validStates = [];
        var _loop_1 = function (index) {
            var _states = [];
            var nextToken = input.LA(index);
            states.forEach(function (s) { _states = _states.concat(_this.ConsumeSingleTokenAhead(s, nextToken)).filter(Utils_1.Utils.notDuplicate); });
            states = _states.filter(Utils_1.Utils.notDuplicate);
        };
        // one step each time. Consume a single token each time.
        for (var index = 1; index <= tokensLeft; index++) {
            _loop_1(index);
        }
        states.forEach(function (s) { validStates = validStates.concat(_this.SearchValidStates(s)); });
        return validStates.map(function (s) { return s.state; }).filter(Utils_1.Utils.notDuplicate);
    };
    LSParserATNSimulator.prototype.ConsumeSingleTokenAhead = function (stateWithTransitionPath, matchToken) {
        var validStates = [];
        var currentState = stateWithTransitionPath.state;
        var nextStateWithTransitionPath = {
            state: null,
            transitionStates: stateWithTransitionPath.transitionStates.slice()
        };
        if (nextStateWithTransitionPath.transitionStates.length > 0 &&
            nextStateWithTransitionPath.transitionStates[nextStateWithTransitionPath.transitionStates.length - 1].ruleIndex === currentState.ruleIndex) {
            nextStateWithTransitionPath.transitionStates.pop();
        }
        nextStateWithTransitionPath.transitionStates.push(currentState);
        if (!(currentState instanceof ATNState.RuleStopState)) {
            for (var index = 0; index < currentState.transitions.length; index++) {
                var transition = currentState.transitions[index];
                var destinationChildState = transition.target;
                nextStateWithTransitionPath.state = destinationChildState;
                if (!transition.isEpsilon) {
                    if (transition.label != null && transition.label.contains(matchToken)) {
                        validStates = validStates.concat(this.SearchValidStates(nextStateWithTransitionPath));
                    }
                }
                else {
                    validStates = validStates.concat(this.ConsumeSingleTokenAhead(nextStateWithTransitionPath, matchToken)).filter(Utils_1.Utils.notDuplicate);
                }
            }
        }
        return validStates.filter(Utils_1.Utils.notEmpty);
    };
    LSParserATNSimulator.prototype.SearchValidStates = function (stateWithTransitionPath) {
        var validStates = [];
        if (!this.IsLastStateBeforeRuleStopState(stateWithTransitionPath.state)) {
            validStates.push(stateWithTransitionPath);
        }
        else {
            validStates = this.BackTracingAndFindActiveStates(stateWithTransitionPath).filter(Utils_1.Utils.notDuplicate);
            if (this.HasActiveChildrenState(stateWithTransitionPath.state)) {
                validStates.push(stateWithTransitionPath);
            }
        }
        return validStates;
    };
    LSParserATNSimulator.prototype.BackTracingAndFindActiveStates = function (stateWithTransitionPath) {
        var validStates = [];
        var completedRuleIndex = stateWithTransitionPath.state.ruleIndex;
        var statesStack = this.GetLastStateInDifferentRulesFomStatesStack(stateWithTransitionPath.transitionStates, completedRuleIndex);
        var currentStateIndex = statesStack.length - 1;
        var keepBackTracing = true;
        while (keepBackTracing && currentStateIndex >= 0) {
            var currentState = statesStack[currentStateIndex];
            keepBackTracing = false;
            var followingStates = this.GetRuleFollowingState(currentState, completedRuleIndex);
            for (var index = 0; index < followingStates.length; index++) {
                var lastStateBeforeRuleStopState = false;
                var haveActiveChildrenStatesInCurrentRule = false;
                var transitions = followingStates[index].transitions;
                while (transitions.length > 0) {
                    var epsilonTrans = [];
                    for (var tIndex = 0; tIndex < transitions.length; tIndex++) {
                        if (transitions[tIndex].isEpsilon) {
                            if (transitions[tIndex] instanceof Transition.RuleTransition) {
                                haveActiveChildrenStatesInCurrentRule = true;
                            }
                            else if (transitions[tIndex].target instanceof ATNState.RuleStopState) {
                                lastStateBeforeRuleStopState = true;
                            }
                            else {
                                epsilonTrans = epsilonTrans.concat(transitions[tIndex].target.transitions);
                            }
                        }
                        else {
                            haveActiveChildrenStatesInCurrentRule = true;
                        }
                    }
                    transitions = epsilonTrans;
                    if (lastStateBeforeRuleStopState && haveActiveChildrenStatesInCurrentRule) {
                        // We can jump out of loop ahead of schedule.
                        break;
                    }
                }
                if (lastStateBeforeRuleStopState) {
                    keepBackTracing = true;
                }
                if (haveActiveChildrenStatesInCurrentRule) {
                    //validStates.push(followingStates[index]);
                    var newValidState = {
                        state: followingStates[index],
                        transitionStates: statesStack.slice(0, currentStateIndex + 1)
                    };
                    validStates.push(newValidState);
                }
            }
            currentStateIndex--;
            if (keepBackTracing) {
                completedRuleIndex = followingStates[0].ruleIndex;
            }
        }
        return validStates.filter(Utils_1.Utils.notEmpty);
    };
    LSParserATNSimulator.prototype.GetLastStateInDifferentRulesFomStatesStack = function (statesStack, lastMatchedRuleIndex) {
        var lastStates = [];
        var matchedRuleIndex = lastMatchedRuleIndex;
        for (var currentStateIndex = statesStack.length - 1; currentStateIndex >= 0; currentStateIndex--) {
            if (statesStack[currentStateIndex].ruleIndex === matchedRuleIndex) {
                continue;
            }
            else {
                lastStates.push(statesStack[currentStateIndex]);
                matchedRuleIndex = statesStack[currentStateIndex].ruleIndex;
            }
        }
        lastStates.reverse();
        return lastStates.filter(Utils_1.Utils.notEmpty);
    };
    LSParserATNSimulator.prototype.GetRuleFollowingState = function (state, ruleIndex) {
        var followingStates = [];
        if (state instanceof ATNState.RuleStopState) {
            return followingStates;
        }
        var transitions = state.transitions;
        while (transitions.length > 0) {
            var epsilonTrans = [];
            for (var index = 0; index < transitions.length; index++) {
                if (transitions[index].isEpsilon) {
                    if (transitions[index] instanceof Transition.RuleTransition) {
||||||
|
if (transitions[index].ruleIndex === ruleIndex) {
|
||||||
|
followingStates.push(transitions[index].followState);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
else if (!(transitions[index].target instanceof ATNState.RuleStopState)) {
|
||||||
|
epsilonTrans = epsilonTrans.concat(transitions[index].target.transitions);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
transitions = epsilonTrans;
|
||||||
|
}
|
||||||
|
return followingStates.filter(Utils_1.Utils.notEmpty);
|
||||||
|
};
|
||||||
|
// Means with this state, parser can make up a complete rule.
|
||||||
|
LSParserATNSimulator.prototype.IsLastStateBeforeRuleStopState = function (state) {
|
||||||
|
var transitions = state.transitions;
|
||||||
|
while (transitions.length > 0) {
|
||||||
|
var epsilonTrans = [];
|
||||||
|
for (var index = 0; index < transitions.length; index++) {
|
||||||
|
if (transitions[index].isEpsilon) {
|
||||||
|
if (transitions[index].target instanceof ATNState.RuleStopState) {
|
||||||
|
return true;
|
||||||
|
}
|
||||||
|
else if (!(transitions[index] instanceof Transition.RuleTransition)) {
|
||||||
|
epsilonTrans = epsilonTrans.concat(transitions[index].target.transitions);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
transitions = epsilonTrans;
|
||||||
|
}
|
||||||
|
return false;
|
||||||
|
};
|
||||||
|
LSParserATNSimulator.prototype.HasActiveChildrenState = function (state) {
|
||||||
|
var transitions = state.transitions;
|
||||||
|
while (transitions.length > 0) {
|
||||||
|
var epsilonTrans = [];
|
||||||
|
for (var index = 0; index < transitions.length; index++) {
|
||||||
|
if (transitions[index].isEpsilon) {
|
||||||
|
if (transitions[index] instanceof Transition.RuleTransition) {
|
||||||
|
return true;
|
||||||
|
}
|
||||||
|
else if (!(transitions[index].target instanceof ATNState.RuleStopState)) {
|
||||||
|
epsilonTrans = epsilonTrans.concat(transitions[index].target.transitions);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
else {
|
||||||
|
return true;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
transitions = epsilonTrans;
|
||||||
|
}
|
||||||
|
return false;
|
||||||
|
};
|
||||||
|
return LSParserATNSimulator;
|
||||||
|
}(ParserATNSimulator_1.ParserATNSimulator));
|
||||||
|
exports.LSParserATNSimulator = LSParserATNSimulator;
|
||||||
|
//# sourceMappingURL=LSParserATNSimulator.js.map
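The helpers above all share one traversal pattern: repeatedly expand epsilon transitions breadth-first (`while (transitions.length > 0)` with an `epsilonTrans` accumulator) until no epsilon edges remain. A minimal sketch of that epsilon-expansion loop over a toy transition graph — the `isEpsilon`/`target`/`label` shapes here are illustrative stand-ins, not the real antlr4 ATN types, and like the code above it assumes no epsilon cycles:

```javascript
// Toy epsilon-closure walk mirroring the `while (transitions.length > 0)`
// loops above. Each transition is { isEpsilon, target }; non-epsilon
// transitions are collected as "reachable by consuming a token".
function collectTokenTransitions(startState) {
    var results = [];
    var transitions = startState.transitions;
    while (transitions.length > 0) {
        var epsilonTrans = [];
        for (var i = 0; i < transitions.length; i++) {
            if (transitions[i].isEpsilon) {
                // Keep expanding through the epsilon edge's target.
                epsilonTrans = epsilonTrans.concat(transitions[i].target.transitions);
            }
            else {
                results.push(transitions[i]);
            }
        }
        transitions = epsilonTrans;
    }
    return results;
}

// s0 --eps--> s1 --'A'--> s2, and s0 --'B'--> s3
var s2 = { transitions: [] };
var s3 = { transitions: [] };
var s1 = { transitions: [{ isEpsilon: false, label: "A", target: s2 }] };
var s0 = { transitions: [
    { isEpsilon: true, target: s1 },
    { isEpsilon: false, label: "B", target: s3 }
] };
console.log(collectTokenTransitions(s0).map(function (t) { return t.label; })); // → [ 'B', 'A' ]
```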
1  dist/language-service/LSParserATNSimulator.js.map  vendored  Normal file
File diff suppressed because one or more lines are too long
136  dist/language-service/LanguageService.js  vendored  Normal file
@ -0,0 +1,136 @@
"use strict";
//-----------------------------------------------------------------------------
// Copyright (c) Microsoft Corporation. All rights reserved.
//-----------------------------------------------------------------------------
Object.defineProperty(exports, "__esModule", { value: true });
var antlr4 = require("antlr4");
var InputStream_1 = require("antlr4/InputStream");
var IntervalSet_1 = require("antlr4/IntervalSet");
var LSCommonTokenStream_1 = require("./LSCommonTokenStream");
var LSErrorListener_1 = require("./LSErrorListener");
var LSParserATNSimulator_1 = require("./LSParserATNSimulator");
var Utils_1 = require("./Utils");
var StateContext = /** @class */ (function () {
    function StateContext(state, ruleIndex, expectedTokens, ruleStack) {
        this.State = state;
        this.RuleIndex = ruleIndex;
        this.ExpectedTokens = expectedTokens;
        this.RuleStack = ruleStack;
    }
    return StateContext;
}());
var LanguageService = /** @class */ (function () {
    function LanguageService(lexerCtr, parserCtr, keywordsDict) {
        var _this = this;
        this._lexer = null;
        this._parser = null;
        this._keywordsDict = null;
        this.StatesBeforeEof = {};
        this.SyntaxErrors = [];
        this._eofReached = false;
        this.EofReachedInPredict = false;
        this._exThrownAfterEofReached = false;
        this.IsInPredict = false;
        this.GetExpectedTokenStrs = function () {
            var intervalSets = new IntervalSet_1.IntervalSet();
            for (var key in this.StatesBeforeEof) {
                if (this.StatesBeforeEof.hasOwnProperty(key)) {
                    intervalSets.addSet(this.StatesBeforeEof[key].ExpectedTokens);
                }
            }
            var expectedStrings = [];
            if (intervalSets.intervals === null) {
                return expectedStrings;
            }
            for (var i = 0; i < intervalSets.intervals.length; i++) {
                var v = intervalSets.intervals[i];
                if (v.start < 0) {
                    continue;
                }
                for (var j = v.start; j < v.stop; j++) {
                    var tokenString = this._parser._input.tokenSource.symbolicNames[j];
                    if (tokenString != null) {
                        var keyword = this._keywordsDict[tokenString.replace(/^\'|\'$/gi, "")];
                        if (keyword != null) {
                            expectedStrings.push(keyword);
                        }
                    }
                }
            }
            return expectedStrings.filter(Utils_1.Utils.notDuplicate);
        };
        this.RecordStateBeforeEof = function () {
            if (!this.IsInPredict) {
                this.EofReached = true;
                if (!this.ExThrownAfterEofReached) {
                    if (this.StatesBeforeEof[this._parser.state] == undefined || this.StatesBeforeEof[this._parser.state] == null) {
                        this.StatesBeforeEof[this._parser.state] = new StateContext(this._parser.state, this._parser._ctx.ruleIndex, this._parser.getExpectedTokens(), this._parser.getRuleInvocationStack());
                    }
                }
            }
            else {
                this.EofReachedInPredict = true;
            }
        };
        this.RecordErrorStatesBeforeEof = function (states) {
            var _this = this;
            if (states.length > 0) {
                states.forEach(function (state) {
                    if (state != null) {
                        if (_this.StatesBeforeEof[state.stateNumber] == undefined || _this.StatesBeforeEof[state.stateNumber] == null) {
                            _this.StatesBeforeEof[state.stateNumber] = new StateContext(state.stateNumber, state.ruleIndex, _this._parser._interp.atn.nextTokens(state), _this._parser.getRuleInvocationStack());
                        }
                    }
                });
            }
        };
        this.AddSyntaxError = function (msg, line, column) {
            var error = {
                line: line,
                column: column,
                Message: msg
            };
            _this.SyntaxErrors.push(error);
            if (_this._eofReached) {
                _this._exThrownAfterEofReached = true;
            }
        };
        this._lexerCtr = lexerCtr;
        this._parserCtr = parserCtr;
        this._keywordsDict = keywordsDict;
    }
    LanguageService.prototype._parse = function (input) {
        var _this = this;
        this.PrepareParse();
        this._lexer = new this._lexerCtr(new InputStream_1.InputStream(input));
        this._parser = new this._parserCtr(new LSCommonTokenStream_1.LSCommonTokenStream(this._lexer));
        this._parser.getTokenStream().EofListener = function () {
            _this.RecordStateBeforeEof();
        };
        this._parser.removeErrorListeners();
        this._parser.addErrorListener(new LSErrorListener_1.LSErrorListener(function (msg, line, column) {
            _this.AddSyntaxError(msg, line, column);
        }));
        var decisionsToDFA = this._parser.atn.decisionToState.map(function (ds, index) { return new antlr4.dfa.DFA(ds, index); });
        this._parser._interp = new LSParserATNSimulator_1.LSParserATNSimulator(this._parser, this._parser.atn, decisionsToDFA, new antlr4.PredictionContextCache(), this);
        this._parser.root();
    };
    LanguageService.prototype.PrepareParse = function () {
        this._eofReached = false;
        this.EofReachedInPredict = false;
        this._exThrownAfterEofReached = false;
        this.StatesBeforeEof = {};
        this.SyntaxErrors = [];
    };
    LanguageService.prototype.getCompletionWords = function (input) {
        this._parse(input);
        return this.GetExpectedTokenStrs();
    };
    LanguageService.prototype.getSyntaxErrors = function (input) {
        this._parse(input);
        return this.SyntaxErrors;
    };
    return LanguageService;
}());
exports.LanguageService = LanguageService;
//# sourceMappingURL=LanguageService.js.map
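`GetExpectedTokenStrs` above merges the expected-token sets recorded at EOF, maps symbolic token names through the keywords dictionary, and deduplicates. A stripped-down sketch of that mapping step, with plain arrays standing in for antlr4's IntervalSet (the names here are illustrative, not part of the package):

```javascript
// Illustrative reduction of GetExpectedTokenStrs: token type numbers ->
// symbolic names -> user-facing keywords, dropping unknowns and duplicates.
function expectedKeywords(tokenTypes, symbolicNames, keywordsDict) {
    var expectedStrings = [];
    for (var i = 0; i < tokenTypes.length; i++) {
        var tokenString = symbolicNames[tokenTypes[i]];
        if (tokenString != null) {
            // The real code also strips surrounding quotes from literal names.
            var keyword = keywordsDict[tokenString.replace(/^'|'$/g, "")];
            if (keyword != null) {
                expectedStrings.push(keyword);
            }
        }
    }
    return expectedStrings.filter(function (item, pos, self) {
        return self.indexOf(item) == pos; // Utils.notDuplicate
    });
}

var symbolicNames = [null, "SELECT", "FROM", "WS"];
var dict = { "SELECT": "SELECT", "FROM": "FROM" };
console.log(expectedKeywords([1, 2, 2, 3], symbolicNames, dict)); // → [ 'SELECT', 'FROM' ]
```

Token type 3 (`WS`) is filtered out simply because it has no entry in the keywords dictionary, which is how the real service keeps punctuation and whitespace tokens out of the completion list.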
1  dist/language-service/LanguageService.js.map  vendored  Normal file
@ -0,0 +1 @@
{"version":3,"file":"LanguageService.js","sourceRoot":"","sources":["../../src/language-service/LanguageService.ts"],"names":[],"mappings":";AAAA,+EAA+E;AAC/E,6DAA6D;AAC7D,+EAA+E;;AAE/E,+BAAiC;AAEjC,kDAAiD;AACjD,kDAAiD;AAEjD,6DAA4D;AAC5D,qDAAoD;AACpD,+DAA8D;AAE9D,iCAAgC;AAYhC;IASI,sBAAY,KAAc,EAAE,SAAkB,EAAE,cAA4B,EAAE,SAAoB;QAC9F,IAAI,CAAC,KAAK,GAAG,KAAK,CAAC;QACnB,IAAI,CAAC,SAAS,GAAG,SAAS,CAAC;QAC3B,IAAI,CAAC,cAAc,GAAG,cAAc,CAAC;QACrC,IAAI,CAAC,SAAS,GAAG,SAAS,CAAC;IAC/B,CAAC;IACL,mBAAC;AAAD,CAAC,AAfD,IAeC;AAED;IAoBI,yBAAY,QAAgB,EAAE,SAAkB,EAAE,YAA0C;QAA5F,iBAIC;QArBO,WAAM,GAAW,IAAI,CAAC;QACtB,YAAO,GAAY,IAAI,CAAC;QAExB,kBAAa,GAAgC,IAAI,CAAC;QAEnD,oBAAe,GAAsB,EAAE,CAAC;QAExC,iBAAY,GAAqB,EAAE,CAAC;QAEnC,gBAAW,GAAa,KAAK,CAAC;QAE/B,wBAAmB,GAAa,KAAK,CAAC;QAErC,6BAAwB,GAAa,KAAK,CAAC;QAE5C,gBAAW,GAAa,KAAK,CAAC;QA6B9B,yBAAoB,GAAG;YAC1B,IAAI,YAAY,GAAG,IAAI,yBAAW,EAAE,CAAC;YACrC,KAAK,IAAI,GAAG,IAAI,IAAI,CAAC,eAAe,EAAE;gBAClC,IAAI,IAAI,CAAC,eAAe,CAAC,cAAc,CAAC,GAAG,CAAC,EAAE;oBAC1C,YAAY,CAAC,MAAM,CAAC,IAAI,CAAC,eAAe,CAAC,GAAG,CAAC,CAAC,cAAc,CAAC,CAAC;iBACjE;aACJ;YAED,IAAI,eAAe,GAAG,EAAE,CAAC;YACzB,IAAI,YAAY,CAAC,SAAS,KAAK,IAAI,EAAE;gBACjC,OAAO,eAAe,CAAC;aAC1B;YAED,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,YAAY,CAAC,SAAS,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;gBACpD,IAAI,CAAC,GAAG,YAAY,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC;gBAClC,IAAI,CAAC,CAAC,KAAK,GAAG,CAAC,EAAE;oBACb,SAAS;iBACZ;gBAED,KAAK,IAAI,CAAC,GAAG,CAAC,CAAC,KAAK,EAAE,CAAC,GAAG,CAAC,CAAC,IAAI,EAAE,CAAC,EAAE,EAAE;oBACnC,IAAI,WAAW,GAAG,IAAI,CAAC,OAAO,CAAC,MAAM,CAAC,WAAW,CAAC,aAAa,CAAC,CAAC,CAAC,CAAC;oBACnE,IAAI,WAAW,IAAI,IAAI,EAAE;wBACrB,IAAI,OAAO,GAAG,IAAI,CAAC,aAAa,CAAC,WAAW,CAAC,OAAO,CAAC,WAAW,EAAE,EAAE,CAAC,CAAC,CAAC;wBACvE,IAAI,OAAO,IAAI,IAAI,EAAE;4BACjB,eAAe,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC;yBACjC;qBACJ;iBACJ;aACJ;YAED,OAAO,eAAe,CAAC,MAAM,CAAC,aAAK,CAAC,YAAY,CAAC,CAAC;QACtD,CAAC,CAAA;QAEM,yBAAoB,GAAG;YAC1B,IAAI,CAAC,IAAI,CAAC,WAAW,EAAE;gBACnB,IAAI,CAAC,UAAU,GAAG,IAAI,CAAC;gBACvB,IAAI,CAAC,IAAI,CAAC,uBAAuB,EAAE;oBAC/B,IAAI,IAAI,CAAC,eAAe,
CAAC,IAAI,CAAC,OAAO,CAAC,KAAK,CAAC,IAAI,SAAS,IAAI,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC,OAAO,CAAC,KAAK,CAAC,IAAI,IAAI,EAAE;wBAC3G,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC,OAAO,CAAC,KAAK,CAAC,GAAG,IAAI,YAAY,CAAC,IAAI,CAAC,OAAO,CAAC,KAAK,EAAE,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,SAAS,EAAE,IAAI,CAAC,OAAO,CAAC,iBAAiB,EAAE,EAAE,IAAI,CAAC,OAAO,CAAC,sBAAsB,EAAE,CAAC,CAAC;qBACzL;iBACJ;aACJ;iBAAM;gBACH,IAAI,CAAC,mBAAmB,GAAG,IAAI,CAAC;aACnC;QACL,CAAC,CAAA;QAEM,+BAA0B,GAAG,UAAS,MAA4B;YAArC,iBAUnC;YATG,IAAI,MAAM,CAAC,MAAM,GAAG,CAAC,EAAE;gBACnB,MAAM,CAAC,OAAO,CAAC,UAAA,KAAK;oBAChB,IAAI,KAAK,IAAI,IAAI,EAAE;wBACf,IAAI,KAAI,CAAC,eAAe,CAAC,KAAK,CAAC,WAAW,CAAC,IAAI,SAAS,IAAI,KAAI,CAAC,eAAe,CAAC,KAAK,CAAC,WAAW,CAAC,IAAI,IAAI,EAAE;4BACzG,KAAI,CAAC,eAAe,CAAC,KAAK,CAAC,WAAW,CAAC,GAAG,IAAI,YAAY,CAAC,KAAK,CAAC,WAAW,EAAE,KAAK,CAAC,SAAS,EAAE,KAAI,CAAC,OAAO,CAAC,OAAO,CAAC,GAAG,CAAC,UAAU,CAAC,KAAK,CAAC,EAAE,KAAI,CAAC,OAAO,CAAC,sBAAsB,EAAE,CAAC,CAAC;yBACrL;qBACJ;gBACL,CAAC,CAAC,CAAC;aACN;QACL,CAAC,CAAA;QAEM,mBAAc,GAAG,UAAC,GAAY,EAAE,IAAa,EAAE,MAAe;YACjE,IAAI,KAAK,GAAmB;gBACxB,IAAI,EAAG,IAAI;gBACX,MAAM,EAAG,MAAM;gBACf,OAAO,EAAG,GAAG;aAChB,CAAC;YAEF,KAAI,CAAC,YAAY,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC;YAE9B,IAAI,KAAI,CAAC,WAAW,EAAE;gBAClB,KAAI,CAAC,wBAAwB,GAAG,IAAI,CAAC;aACxC;QACL,CAAC,CAAA;QAhGG,IAAI,CAAC,SAAS,GAAG,QAAQ,CAAC;QAC1B,IAAI,CAAC,UAAU,GAAG,SAAS,CAAC;QAC5B,IAAI,CAAC,aAAa,GAAG,YAAY,CAAC;IACtC,CAAC;IAEO,gCAAM,GAAd,UAAe,KAAc;QAA7B,iBAmBC;QAlBG,IAAI,CAAC,YAAY,EAAE,CAAC;QACpB,IAAI,CAAC,MAAM,GAAG,IAAI,IAAI,CAAC,SAAS,CAAC,IAAI,yBAAW,CAAC,KAAK,CAAC,CAAC,CAAC;QACzD,IAAI,CAAC,OAAO,GAAG,IAAI,IAAI,CAAC,UAAU,CAAC,IAAI,yCAAmB,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC;QAEzE,IAAI,CAAC,OAAO,CAAC,cAAc,EAAE,CAAC,WAAW,GAAG;YACxC,KAAI,CAAC,oBAAoB,EAAE,CAAC;QAChC,CAAC,CAAC;QAEF,IAAI,CAAC,OAAO,CAAC,oBAAoB,EAAE,CAAC;QACpC,IAAI,CAAC,OAAO,CAAC,gBAAgB,CAAC,IAAI,iCAAe,CAC7C,UAAC,GAAG,EAAE,IAAI,EAAE,MAAM;YACd,KAAI,CAAC,cAAc,CAAC,GAAG,EAAE,IAAI,EAAE,MAAM,CAAC,CAAC;QAC3C,CAAC,CACJ,CAAC,CAAC;QAEH,IAAI,cAAc,GAAG,IAAI,CAAC,OAAO,CAAC,GAAG,CAAC,eAAe,CAAC,GAAG,C
AAC,UAAC,EAAE,EAAE,KAAK,IAAO,OAAO,IAAI,MAAM,CAAC,GAAG,CAAC,GAAG,CAAC,EAAE,EAAE,KAAK,CAAC,CAAC,CAAA,CAAC,CAAC,CAAC;QACnH,IAAI,CAAC,OAAO,CAAC,OAAO,GAAG,IAAI,2CAAoB,CAAC,IAAI,CAAC,OAAO,EAAE,IAAI,CAAC,OAAO,CAAC,GAAG,EAAE,cAAc,EAAE,IAAI,MAAM,CAAC,sBAAsB,EAAE,EAAE,IAAI,CAAC,CAAC;QAC3I,IAAI,CAAC,OAAO,CAAC,IAAI,EAAE,CAAC;IACxB,CAAC;IA0EM,sCAAY,GAAnB;QACI,IAAI,CAAC,WAAW,GAAG,KAAK,CAAC;QACzB,IAAI,CAAC,mBAAmB,GAAG,KAAK,CAAC;QACjC,IAAI,CAAC,wBAAwB,GAAG,KAAK,CAAC;QACtC,IAAI,CAAC,eAAe,GAAG,EAAE,CAAC;QAC1B,IAAI,CAAC,YAAY,GAAG,EAAE,CAAC;IAC3B,CAAC;IAEM,4CAAkB,GAAzB,UAA0B,KAAc;QACpC,IAAI,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC;QACnB,OAAO,IAAI,CAAC,oBAAoB,EAAE,CAAC;IACvC,CAAC;IAEM,yCAAe,GAAtB,UAAuB,KAAc;QACjC,IAAI,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC;QACnB,OAAO,IAAI,CAAC,YAAY,CAAC;IAC7B,CAAC;IACL,sBAAC;AAAD,CAAC,AAxID,IAwIC;AAxIY,0CAAe"}
18  dist/language-service/Utils.js  vendored  Normal file
@ -0,0 +1,18 @@
"use strict";
//-----------------------------------------------------------------------------
// Copyright (c) Microsoft Corporation. All rights reserved.
//-----------------------------------------------------------------------------
Object.defineProperty(exports, "__esModule", { value: true });
var Utils = /** @class */ (function () {
    function Utils() {
    }
    Utils.notEmpty = function (value) {
        return value !== null && value !== undefined;
    };
    Utils.notDuplicate = function (item, pos, self) {
        return self.indexOf(item) == pos;
    };
    return Utils;
}());
exports.Utils = Utils;
//# sourceMappingURL=Utils.js.map
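Both helpers are written to be passed directly to `Array.prototype.filter`, which supplies the `(item, pos, self)` arguments that `notDuplicate` expects. In isolation:

```javascript
// Utils.notEmpty / Utils.notDuplicate as used throughout this package:
// plain predicates handed to Array.prototype.filter.
var notEmpty = function (value) {
    return value !== null && value !== undefined;
};
var notDuplicate = function (item, pos, self) {
    // Keep only the first occurrence of each item.
    return self.indexOf(item) == pos;
};

var words = ["SELECT", null, "FROM", "SELECT", undefined];
console.log(words.filter(notEmpty).filter(notDuplicate)); // → [ 'SELECT', 'FROM' ]
```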
1  dist/language-service/Utils.js.map  vendored  Normal file
@ -0,0 +1 @@
{"version":3,"file":"Utils.js","sourceRoot":"","sources":["../../src/language-service/Utils.ts"],"names":[],"mappings":";AAAA,+EAA+E;AAC/E,6DAA6D;AAC7D,+EAA+E;;AAE/E;IAAA;IAQA,CAAC;IAPiB,cAAQ,GAAtB,UAA+B,KAAiC;QAC5D,OAAO,KAAK,KAAK,IAAI,IAAI,KAAK,KAAK,SAAS,CAAC;IACjD,CAAC;IAEa,kBAAY,GAA1B,UAA2B,IAAI,EAAE,GAAG,EAAE,IAAI;QACtC,OAAO,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,IAAI,GAAG,CAAC;IACrC,CAAC;IACL,YAAC;AAAD,CAAC,AARD,IAQC;AARY,sBAAK"}
13  dist/providers/ErrorMarkProvider.js  vendored  Normal file
@ -0,0 +1,13 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
var LanguageServiceFacade_1 = require("../facade/LanguageServiceFacade");
var ErrorMarkProvider = /** @class */ (function () {
    function ErrorMarkProvider() {
    }
    ErrorMarkProvider.getErrorMark = function (input) {
        return LanguageServiceFacade_1.LanguageServiceFacade.GetLanguageServiceParseResult(input, LanguageServiceFacade_1.ParseReason.GetErrors);
    };
    return ErrorMarkProvider;
}());
exports.ErrorMarkProvider = ErrorMarkProvider;
//# sourceMappingURL=ErrorMarkProvider.js.map
1  dist/providers/ErrorMarkProvider.js.map  vendored  Normal file
@ -0,0 +1 @@
{"version":3,"file":"ErrorMarkProvider.js","sourceRoot":"","sources":["../../src/providers/ErrorMarkProvider.ts"],"names":[],"mappings":";;AAAA,yEAAsF;AAItF;IAAA;IAIA,CAAC;IAHiB,8BAAY,GAA1B,UAA2B,KAAa;QACpC,OAAO,6CAAqB,CAAC,6BAA6B,CAAC,KAAK,EAAE,mCAAW,CAAC,SAAS,CAAC,CAAC;IAC7F,CAAC;IACL,wBAAC;AAAD,CAAC,AAJD,IAIC;AAJY,8CAAiB"}
22  dist/providers/SqlCompletionItemProvider.js  vendored  Normal file
@ -0,0 +1,22 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
var LanguageServiceFacade_1 = require("../facade/LanguageServiceFacade");
var SqlCompletionItemProvider = /** @class */ (function () {
    function SqlCompletionItemProvider() {
        this.triggerCharacters = [" ", "."];
    }
    SqlCompletionItemProvider.prototype.provideCompletionItems = function (model, position, token) {
        var range = {
            startLineNumber: 1,
            startColumn: 1,
            endLineNumber: position.lineNumber,
            endColumn: position.column
        };
        var text = model.getValueInRange(range);
        text = this.triggerCharacters.indexOf(text.charAt(text.length - 1)) < 0 ? text.substring(0, text.length - 1) : text;
        return LanguageServiceFacade_1.LanguageServiceFacade.GetLanguageServiceParseResult(text, LanguageServiceFacade_1.ParseReason.GetCompletionWords);
    };
    return SqlCompletionItemProvider;
}());
exports.SqlCompletionItemProvider = SqlCompletionItemProvider;
//# sourceMappingURL=SqlCompletionItemProvider.js.map
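The provider trims the final character unless it is one of the trigger characters, presumably so that a word still being typed (e.g. `FR`) is not fed to the parser as if it were complete. The truncation rule in isolation (the `textToParse` name is illustrative):

```javascript
// Mirror of the truncation in provideCompletionItems: keep the text as-is
// when it ends in a trigger character, otherwise drop the last character
// of the in-progress word before handing the text to the parser.
var triggerCharacters = [" ", "."];
function textToParse(text) {
    return triggerCharacters.indexOf(text.charAt(text.length - 1)) < 0
        ? text.substring(0, text.length - 1)
        : text;
}

console.log(textToParse("SELECT * FROM c ")); // → 'SELECT * FROM c '
console.log(textToParse("SELECT * FR"));      // → 'SELECT * F'
```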
1  dist/providers/SqlCompletionItemProvider.js.map  vendored  Normal file
@ -0,0 +1 @@
{"version":3,"file":"SqlCompletionItemProvider.js","sourceRoot":"","sources":["../../src/providers/SqlCompletionItemProvider.ts"],"names":[],"mappings":";;AAAA,yEAAqF;AAGrF;IAAA;QACW,sBAAiB,GAAa,CAAC,GAAG,EAAC,GAAG,CAAC,CAAC;IAcnD,CAAC;IAZG,0DAAsB,GAAtB,UAAuB,KAAmC,EAAE,QAAyB,EAAE,KAA+B;QAClH,IAAM,KAAK,GAAG;YACV,eAAe,EAAE,CAAC;YAClB,WAAW,EAAE,CAAC;YACd,aAAa,EAAE,QAAQ,CAAC,UAAU;YAClC,SAAS,EAAE,QAAQ,CAAC,MAAM;SAC7B,CAAA;QAED,IAAI,IAAI,GAAG,KAAK,CAAC,eAAe,CAAC,KAAK,CAAC,CAAC;QACxC,IAAI,GAAG,IAAI,CAAC,iBAAiB,CAAC,OAAO,CAAC,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,EAAE,IAAI,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC;QACpH,OAAO,6CAAqB,CAAC,6BAA6B,CAAC,IAAI,EAAE,mCAAW,CAAC,kBAAkB,CAAC,CAAC;IACrG,CAAC;IACL,gCAAC;AAAD,CAAC,AAfD,IAeC;AAfY,8DAAyB"}
30  dist/worker/LanguageServiceWorker.js  vendored  Normal file
@ -0,0 +1,30 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
var LanguageService_1 = require("../language-service/LanguageService");
var CosmosDBSqlLexer_1 = require("../cosmosdb-sql/generated/CosmosDBSqlLexer");
var CosmosDBSqlParser_1 = require("../cosmosdb-sql/generated/CosmosDBSqlParser");
var CosmosDBSqlKeywords_1 = require("../cosmosdb-sql/grammar/CosmosDBSqlKeywords");
var ParseReason;
(function (ParseReason) {
    ParseReason[ParseReason["GetCompletionWords"] = 1] = "GetCompletionWords";
    ParseReason[ParseReason["GetErrors"] = 2] = "GetErrors";
})(ParseReason || (ParseReason = {}));
var LanguageServiceWorker;
(function (LanguageServiceWorker) {
    // Respond to messages from the parent thread.
    onmessage = function (event) {
        var code = event.data.code;
        var reason = event.data.reason;
        var parseResults = [];
        var languageService = new LanguageService_1.LanguageService(CosmosDBSqlLexer_1.CosmosDBSqlLexer, CosmosDBSqlParser_1.CosmosDBSqlParser, CosmosDBSqlKeywords_1.CosmosDBSqlKeywords.keywordsRegisteredForCompletion);
        if (reason == ParseReason.GetCompletionWords) {
            parseResults = languageService.getCompletionWords(code);
        }
        else if (reason == ParseReason.GetErrors) {
            parseResults = languageService.getSyntaxErrors(code);
        }
        postMessage(parseResults, undefined);
        close();
    };
})(LanguageServiceWorker = exports.LanguageServiceWorker || (exports.LanguageServiceWorker = {}));
//# sourceMappingURL=LanguageServiceWorker.js.map
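The `ParseReason` IIFE is TypeScript's compiled numeric-enum pattern: each assignment writes the forward mapping (name to value) and, through the returned value, the reverse mapping (value to name) on the same object. Run in isolation:

```javascript
// Compiled TypeScript numeric enum: ParseReason["GetErrors"] = 2 evaluates
// to 2, so the outer assignment also sets ParseReason[2] = "GetErrors".
var ParseReason;
(function (ParseReason) {
    ParseReason[ParseReason["GetCompletionWords"] = 1] = "GetCompletionWords";
    ParseReason[ParseReason["GetErrors"] = 2] = "GetErrors";
})(ParseReason || (ParseReason = {}));

console.log(ParseReason.GetCompletionWords); // → 1
console.log(ParseReason[2]);                 // → 'GetErrors'
```

The reverse mapping is what lets the worker's numeric `event.data.reason` stay readable when logged or debugged.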
1  dist/worker/LanguageServiceWorker.js.map  vendored  Normal file
@ -0,0 +1 @@
{"version":3,"file":"LanguageServiceWorker.js","sourceRoot":"","sources":["../../src/worker/LanguageServiceWorker.ts"],"names":[],"mappings":";;AAAA,uEAAsE;AACtE,+EAA8E;AAC9E,iFAAgF;AAChF,mFAAkF;AAElF,IAAK,WAGJ;AAHD,WAAK,WAAW;IACZ,yEAAsB,CAAA;IACtB,uDAAa,CAAA;AACjB,CAAC,EAHI,WAAW,KAAX,WAAW,QAGf;AAED,IAAc,qBAAqB,CAmBlC;AAnBD,WAAc,qBAAqB;IAC/B,wCAAwC;IACxC,SAAS,GAAG,UAAC,KAAmB;QAC5B,IAAM,IAAI,GAAW,KAAK,CAAC,IAAI,CAAC,IAAI,CAAC;QACrC,IAAM,MAAM,GAAY,KAAK,CAAC,IAAI,CAAC,MAAM,CAAC;QAE1C,IAAI,YAAY,GAAG,EAAE,CAAC;QAEtB,IAAI,eAAe,GAAG,IAAI,iCAAe,CAAC,mCAAgB,EAAE,qCAAiB,EAAE,yCAAmB,CAAC,+BAA+B,CAAC,CAAC;QAEpI,IAAI,MAAM,IAAI,WAAW,CAAC,kBAAkB,EAAE;YAC1C,YAAY,GAAG,eAAe,CAAC,kBAAkB,CAAC,IAAI,CAAC,CAAC;SAC3D;aAAM,IAAI,MAAM,IAAI,WAAW,CAAC,SAAS,EAAE;YACxC,YAAY,GAAG,eAAe,CAAC,eAAe,CAAC,IAAI,CAAC,CAAC;SACxD;QAED,WAAW,CAAC,YAAY,EAAE,SAAS,CAAC,CAAC;QACrC,KAAK,EAAE,CAAC;IACZ,CAAC,CAAC;AACN,CAAC,EAnBa,qBAAqB,GAArB,6BAAqB,KAArB,6BAAqB,QAmBlC"}
29  dist/worker/webpack.config.js  vendored  Normal file
@ -0,0 +1,29 @@
"use strict";
module.exports = {
    entry: {
        LanguageServiceWorker: './LanguageServiceWorker.ts'
    },
    mode: 'production',
    target: 'web',
    module: {
        rules: [
            {
                test: /\.ts$/,
                use: 'ts-loader',
                exclude: /node_modules/
            }
        ]
    },
    resolve: {
        extensions: ['.ts', '.js']
    },
    output: {
        globalObject: 'this',
        path: __dirname + "/dist",
        filename: '[name].js',
        library: '[name]',
        libraryTarget: 'umd'
    },
    node: { fs: "empty" }
};
//# sourceMappingURL=webpack.config.js.map
1  dist/worker/webpack.config.js.map  vendored  Normal file
@ -0,0 +1 @@
{"version":3,"file":"webpack.config.js","sourceRoot":"","sources":["../../src/worker/webpack.config.js"],"names":[],"mappings":"AAAA,YAAY,CAAC;AAEb,MAAM,CAAC,OAAO,GAAG;IACb,KAAK,EAAE;QACH,qBAAqB,EAAE,4BAA4B;KACtD;IACD,IAAI,EAAE,YAAY;IAClB,MAAM,EAAE,KAAK;IACb,MAAM,EAAE;QACJ,KAAK,EAAE;YACH;gBACI,IAAI,EAAE,OAAO;gBACb,GAAG,EAAE,WAAW;gBAChB,OAAO,EAAE,cAAc;aAC1B;SACJ;KACJ;IACD,OAAO,EAAE;QACL,UAAU,EAAE,CAAC,KAAK,EAAE,KAAK,CAAC;KAC7B;IACD,MAAM,EAAE;QACJ,YAAY,EAAE,MAAM;QACpB,IAAI,EAAE,SAAS,GAAG,OAAO;QACzB,QAAQ,EAAE,WAAW;QACrB,OAAO,EAAE,QAAQ;QACjB,aAAa,EAAE,KAAK;KACvB;IACD,IAAI,EAAE,EAAE,EAAE,EAAE,OAAO,EAAE;CACxB,CAAC"}
13  index.d.ts  vendored  Normal file
@ -0,0 +1,13 @@
import {editor, Position, CancellationToken} from "monaco-editor";
import * as Q from "q";

declare class SqlCompletionItemProvider {
    public triggerCharacters: string[];
    provideCompletionItems(model: editor.IReadOnlyModel, position: Position, token: CancellationToken);
}

declare class ErrorMarkProvider {
    public static getErrorMark(input: string): Q.Promise<editor.IMarkerData[]>;
}

export {SqlCompletionItemProvider, ErrorMarkProvider};
4  index.js  Normal file
@ -0,0 +1,4 @@
"use strict";

exports.SqlCompletionItemProvider = require("./dist/providers/SqlCompletionItemProvider").SqlCompletionItemProvider;
exports.ErrorMarkProvider = require("./dist/providers/ErrorMarkProvider").ErrorMarkProvider;
1442  package-lock.json  generated  Normal file
File diff suppressed because it is too large
30  package.json  Normal file
@ -0,0 +1,30 @@
{
  "author": "Microsoft Corporation",
  "name": "@azure/cosmos-language-service",
  "version": "0.0.1",
  "description": "Cosmos DB SQL Language Service for the Monaco editor",
  "dependencies": {
    "antlr4": "4.7.1",
    "monaco-editor": "0.14.3",
    "q": "1.5.1",
    "react": "16.5.2"
  },
  "devDependencies": {
    "@types/react": "16.4.6",
    "ts-loader": "5.1.1",
    "ts-node": "7.0.0",
    "typescript": "3.0.3"
  },
  "scripts": {
    "dev": "tsc"
  },
  "repository": {
    "type": "git",
    "url": "https://github.com/Azure/cosmos-sql-language-service"
  },
  "keywords": [
    "Cosmos DB",
    "Data",
    "Language Service"
  ]
}
1070  src/cosmosdb-sql/generated/CosmosDBSqlLexer.js  Normal file
File diff suppressed because it is too large
4632  src/cosmosdb-sql/generated/CosmosDBSqlParser.js  Normal file
File diff suppressed because it is too large
124  src/cosmosdb-sql/grammar/CosmosDBSqlKeywords.ts  Normal file
@ -0,0 +1,124 @@
export class CosmosDBSqlKeywords {

    private static readonly KeywordTypeHintPrefix: string = "KeywordTypeHint:";

    public static keywordsRegisteredForCompletion: { [key: string]: string } =
    {
        "AND": "AND",
        "ARRAY": "ARRAY",
        "AS": "AS",
        "ASC": "ASC",
        "BETWEEN": "BETWEEN",
        "BY": "BY",
        "CASE": "CASE",
        "CAST": "CAST",
        "CONVERT": "CONVERT",
        "CROSS": "CROSS",
        "DESC": "DESC",
        "DISTINCT": "DISTINCT",
        "ELSE": "ELSE",
        "END": "END",
        "ESCAPE": "ESCAPE",
        "EXISTS": "EXISTS",
        "K_false": "false",
        "FOR": "FOR",
        "FROM": "FROM",
        "GROUP": "GROUP",
        "HAVING": "HAVING",
        "IN": "IN",
        "INNER": "INNER",
        "INSERT": "INSERT",
        "INTO": "INTO",
        "IS": "IS",
        "JOIN": "JOIN",
        "LEFT": "LEFT",
        "LIKE": "LIKE",
        "LIMIT": "LIMIT",
        "NOT": "NOT",
        "K_null": "null",
        "OFFSET": "OFFSET",
        "ON": "ON",
        "OR": "OR",
        "ORDER": "ORDER",
        "OUTER": "OUTER",
        "OVER": "OVER",
        "RIGHT": "RIGHT",
        "SELECT": "SELECT",
        "SET": "SET",
        "THEN": "THEN",
        "TOP": "TOP",
        "K_true": "true",
        "K_udf": "udf",
        "K_undefined": "undefined",
        "UPDATE": "UPDATE",
        "VALUE": "VALUE",
        "WHEN": "WHEN",
        "WHERE": "WHERE",
        "WITH": "WITH",
        "Infinity": "Infinity",
        "NaN": "NaN",

        "ABS": "ABS",
        "ACOS": "ACOS",
        "ARRAY_CONCAT": "ARRAY_CONCAT",
        "ARRAY_CONTAINS": "ARRAY_CONTAINS",
        "ARRAY_LENGTH": "ARRAY_LENGTH",
        "ARRAY_SLICE": "ARRAY_SLICE",
        "ASIN": "ASIN",
        "ATAN": "ATAN",
        "ATN2": "ATN2",
        "AVG": "AVG",
        "CEILING": "CEILING",
        "CONCAT": "CONCAT",
        "CONTAINS": "CONTAINS",
        "COS": "COS",
        "COT": "COT",
        "COUNT": "COUNT",
        "DEGREES": "DEGREES",
        "ENDSWITH": "ENDSWITH",
        "EXP": "EXP",
        "FLOOR": "FLOOR",
        "INDEX_OF": "INDEX_OF",
        "S_ARRAY": "S_ARRAY",
        "IS_BOOL": "IS_BOOL",
        "IS_DEFINED": "IS_DEFINED",
        "IS_FINITE_NUMBER": "IS_FINITE_NUMBER",
        "IS_NULL": "IS_NULL",
        "IS_NUMBER": "IS_NUMBER",
        "IS_OBJECT": "IS_OBJECT",
        "IS_PRIMITIVE": "IS_PRIMITIVE",
        "IS_STRING": "IS_STRING",
        "LENGTH": "LENGTH",
        "LOG10": "LOG10",
        "LOWER": "LOWER",
        "LTRIM": "LTRIM",
        "MAX": "MAX",
        "MIN": "MIN",
        "PI": "PI",
        "POWER": "POWER",
        "RADIANS": "RADIANS",
        "RAND": "RAND",
        "REPLACE": "REPLACE",
        "REPLICATE": "REPLICATE",
        "REVERSE": "REVERSE",
        "ROUND": "ROUND",
        "RTRIM": "RTRIM",
        "SIGN": "SIGN",
        "SIN": "SIN",
        "SQRT": "SQRT",
        "SQUARE": "SQUARE",
        "ST_DISTANCE": "ST_DISTANCE",
        "ST_INTERSECTS": "ST_INTERSECTS",
        "ST_ISVALID": "ST_ISVALID",
        "ST_ISVALIDDETAILED": "ST_ISVALIDDETAILED",
        "ST_WITHIN": "ST_WITHIN",
        "STARTSWITH": "STARTSWITH",
        "SUBSTRING": "SUBSTRING",
        "SUM": "SUM",
        "TAN": "TAN",
        "TRUNC": "TRUNC",
        "UPPER": "UPPER",
        "ID": CosmosDBSqlKeywords.KeywordTypeHintPrefix + "ID",
        "NUMBER": CosmosDBSqlKeywords.KeywordTypeHintPrefix + "NUMBER"
    };
}
184
src/cosmosdb-sql/grammar/CosmosDBSqlLexer.g4
Normal file
184
src/cosmosdb-sql/grammar/CosmosDBSqlLexer.g4
Normal file
@ -0,0 +1,184 @@
lexer grammar CosmosDBSqlLexer;

// keywords

AND: A N D;
ARRAY: A R R A Y;
AS: A S;
ASC: A S C;
BETWEEN: B E T W E E N;
BY: B Y;
CASE: C A S E;
CAST: C A S T;
CONVERT: C O N V E R T;
CROSS: C R O S S;
DESC: D E S C;
DISTINCT: D I S T I N C T;
ELSE: E L S E;
END: E N D;
ESCAPE: E S C A P E;
EXISTS: E X I S T S;
K_false: 'false'; // case sensitive
FOR: F O R;
FROM: F R O M;
GROUP: G R O U P;
HAVING: H A V I N G;
IN: I N;
INNER: I N N E R;
INSERT: I N S E R T;
INTO: I N T O;
IS: I S;
JOIN: J O I N;
LEFT: L E F T;
LIKE: L I K E;
LIMIT: L I M I T;
NOT: N O T;
K_null: 'null'; // case sensitive
OFFSET: O F F S E T;
ON: O N;
OR: O R;
ORDER: O R D E R;
OUTER: O U T E R;
OVER: O V E R;
RIGHT: R I G H T;
SELECT: S E L E C T;
SET: S E T;
THEN: T H E N;
TOP: T O P;
K_true: 'true'; // case sensitive
K_udf: 'udf'; // case sensitive
K_undefined: 'undefined'; // case sensitive
UPDATE: U P D A T E;
VALUE: V A L U E;
WHEN: W H E N;
WHERE: W H E R E;
WITH: W I T H;
Infinity: 'Infinity'; // case sensitive
NaN: 'NaN'; // case sensitive

// built-in functions

ABS: A B S;
ACOS: A C O S;
ARRAY_CONCAT: A R R A Y '_' C O N C A T;
ARRAY_CONTAINS: A R R A Y '_' C O N T A I N S;
ARRAY_LENGTH: A R R A Y '_' L E N G T H;
ARRAY_SLICE: A R R A Y '_' S L I C E;
ASIN: A S I N;
ATAN: A T A N;
ATN2: A T N '2';
AVG: A V G;
CEILING: C E I L I N G;
CONCAT: C O N C A T;
CONTAINS: C O N T A I N S;
COS: C O S;
COT: C O T;
COUNT: C O U N T;
DEGREES: D E G R E E S;
ENDSWITH: E N D S W I T H;
EXP: E X P;
FLOOR: F L O O R;
INDEX_OF: I N D E X '_' O F;
IS_ARRAY: I S '_' A R R A Y;
IS_BOOL: I S '_' B O O L;
IS_DEFINED: I S '_' D E F I N E D;
IS_FINITE_NUMBER: I S '_' F I N I T E '_' N U M B E R;
IS_NULL: I S '_' N U L L;
IS_NUMBER: I S '_' N U M B E R;
IS_OBJECT: I S '_' O B J E C T;
IS_PRIMITIVE: I S '_' P R I M I T I V E;
IS_STRING: I S '_' S T R I N G;
LENGTH: L E N G T H;
LOG: L O G;
LOG10: L O G '1' '0';
LOWER: L O W E R;
LTRIM: L T R I M;
MAX: M A X;
MIN: M I N;
PI: P I;
POWER: P O W E R;
RADIANS: R A D I A N S;
RAND: R A N D;
REPLACE: R E P L A C E;
REPLICATE: R E P L I C A T E;
REVERSE: R E V E R S E;
ROUND: R O U N D;
RTRIM: R T R I M;
SIGN: S I G N;
SIN: S I N;
SQRT: S Q R T;
SQUARE: S Q U A R E;
ST_DISTANCE: S T '_' D I S T A N C E;
ST_INTERSECTS: S T '_' I N T E R S E C T S;
ST_ISVALID: S T '_' I S V A L I D;
ST_ISVALIDDETAILED: S T '_' I S V A L I D D E T A I L E D;
ST_WITHIN: S T '_' W I T H I N;
STARTSWITH: S T A R T S W I T H;
SUBSTRING: S U B S T R I N G;
SUM: S U M;
TAN: T A N;
TRUNC: T R U N C;
UPPER: U P P E R;

// others

SPACE: [ \t\r\n]+ -> skip;
COMMENTS: '-' '-' ~[\t\r\n]+ [\t\r\n] -> skip;

// keywords type groups
ID: [a-zA-Z_][a-zA-Z_0-9]*;
NUMBER: [1-9][0-9]*;

// operators
COL: 'C';
COMMA: ',';
DOT: '.';
ADD: '+';
SUB: '-';
MUL: '*';
DIV: '/';
MOD: '%';
COLON: ':';
EQUAL: '=';
GREATER: '>';
LESS: '<';
BIT_NOT_OP: '~';
BIT_OR_OP: '|';
BIT_AND_OP: '&';
BIT_XOR_OP: '^';
QUEST: '?';
LEFT_BRACE: '{';
RIGHT_BRACE: '}';
LEFT_BRACKET: '[';
RIGHT_BRACKET: ']';
LEFT_PARENTHESIS: '(';
RIGHT_PARENTHESIS: ')';
QUOTE: '"'|'\'';

fragment A : [aA];
fragment B : [bB];
fragment C : [cC];
fragment D : [dD];
fragment E : [eE];
fragment F : [fF];
fragment G : [gG];
fragment H : [hH];
fragment I : [iI];
fragment J : [jJ];
fragment K : [kK];
fragment L : [lL];
fragment M : [mM];
fragment N : [nN];
fragment O : [oO];
fragment P : [pP];
fragment Q : [qQ];
fragment R : [rR];
fragment S : [sS];
fragment T : [tT];
fragment U : [uU];
fragment V : [vV];
fragment W : [wW];
fragment X : [xX];
fragment Y : [yY];
fragment Z : [zZ];
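An aside (not part of the commit): the `fragment A : [aA];` … `fragment Z : [zZ];` rules above are the standard ANTLR4 idiom for case-insensitive keywords, since each keyword is spelled as a sequence of per-letter alternations. A minimal TypeScript sketch of the same idea, using a hypothetical helper name:

```typescript
// Build a regex source equivalent to ANTLR's per-letter fragments,
// e.g. "SELECT" -> "[sS][eE][lL][eE][cC][tT]".
function caseInsensitivePattern(keyword: string): string {
  return keyword
    .split("")
    .map((ch) =>
      /[a-zA-Z]/.test(ch) ? `[${ch.toLowerCase()}${ch.toUpperCase()}]` : ch,
    )
    .join("");
}

// Mirrors the lexer rule `SELECT: S E L E C T;`
const selectRe = new RegExp(`^${caseInsensitivePattern("SELECT")}$`);
```

This is why `K_true`, `K_null`, and friends are spelled with quoted literals instead: those keywords are deliberately case sensitive.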
250 src/cosmosdb-sql/grammar/CosmosDBSqlParser.g4 Normal file
@ -0,0 +1,250 @@
parser grammar CosmosDBSqlParser;

options {
    tokenVocab = CosmosDBSqlLexer;
}

root
    : sql_query
    ;

sql_query
    : select_clause from_clause? where_clause? orderby_clause?
    ;

select_clause
    : SELECT top_spec? selection
    ;

top_spec
    : TOP NUMBER
    ;

from_clause
    : FROM from_specification
    ;

where_clause
    : WHERE scalar_expression
    ;

orderby_clause
    : ORDER BY orderby_item_list
    ;

selection
    : select_list
    | select_value_spec
    | MUL // FIXME 'SELECT *<EOF>' is not supported actually
    ;

select_value_spec
    : VALUE scalar_expression
    ;

select_list
    : select_item
    | select_list COMMA select_item
    ;

select_item
    : scalar_expression
    | scalar_expression select_alias
    ;

select_alias
    : ID
    | AS ID
    ;

orderby_item_list
    : orderby_item
    | orderby_item_list COMMA orderby_item
    ;

orderby_item
    : scalar_expression
    | scalar_expression ASC
    | scalar_expression DESC
    ;

from_specification
    : primary_from_specification
    | from_specification JOIN primary_from_specification
    ;

primary_from_specification
    : input_collection
    | input_collection input_alias
    | ID IN input_collection
    ;

input_alias
    : ID
    | AS ID
    ;

input_collection
    : relative_path
    | LEFT_PARENTHESIS sql_query RIGHT_PARENTHESIS
    ;

relative_path
    : relative_path_segment
    | relative_path DOT relative_path_segment
    | relative_path LEFT_BRACKET NUMBER RIGHT_BRACKET
    | relative_path LEFT_BRACKET QUOTE relative_path_segment QUOTE RIGHT_BRACKET
    ;

relative_path_segment
    : ID
    ;

array_item_list
    : scalar_expression
    | array_item_list COMMA scalar_expression
    ;

array_create_expression
    : LEFT_BRACKET array_item_list? RIGHT_BRACKET
    ;

property_name
    : ID
    ;

object_property
    : property_name COLON scalar_expression
    ;

object_property_list
    : object_property
    | object_property_list COMMA object_property
    ;

object_create_expression
    : LEFT_BRACE object_property_list? RIGHT_BRACE
    ;

function_arg_list
    : scalar_expression
    | function_arg_list COMMA scalar_expression
    ;

sys_function_name
    : ID
    ;

udf_function_name
    : ID
    ;

function_call_expression
    : sys_function_name LEFT_PARENTHESIS function_arg_list? RIGHT_PARENTHESIS
    | K_udf DOT udf_function_name LEFT_PARENTHESIS function_arg_list? RIGHT_PARENTHESIS
    ;

scalar_expression
    : logical_scalar_expression
    | between_scalar_expression
    ;

logical_scalar_expression
    : binary_expression
    | in_scalar_expression
    | logical_scalar_expression AND logical_scalar_expression
    | logical_scalar_expression OR logical_scalar_expression
    ;

between_scalar_expression
    : binary_expression BETWEEN binary_expression AND binary_expression
    | binary_expression NOT BETWEEN binary_expression AND binary_expression
    ;

in_scalar_expression
    : binary_expression IN LEFT_PARENTHESIS in_scalar_expression_item_list RIGHT_PARENTHESIS
    | binary_expression NOT IN LEFT_PARENTHESIS in_scalar_expression_item_list RIGHT_PARENTHESIS
    ;

exists_scalar_expression
    : EXISTS LEFT_PARENTHESIS sql_query RIGHT_PARENTHESIS
    ;

array_scalar_expression
    : ARRAY LEFT_PARENTHESIS sql_query RIGHT_PARENTHESIS
    ;

in_scalar_expression_item_list
    : scalar_expression
    | in_scalar_expression_item_list COMMA scalar_expression
    ;

binary_expression
    : unary_expression
    | binary_expression ADD binary_expression
    | binary_expression SUB binary_expression
    | binary_expression MUL binary_expression
    | binary_expression DIV binary_expression
    | binary_expression MOD binary_expression
    | binary_expression EQUAL binary_expression
    | binary_expression LESS binary_expression
    | binary_expression GREATER binary_expression
    | binary_expression BIT_AND_OP binary_expression
    | binary_expression BIT_OR_OP binary_expression
    | binary_expression BIT_XOR_OP binary_expression
    ;

unary_expression
    : primary_expression
    | SUB unary_expression
    | ADD unary_expression
    | BIT_NOT_OP unary_expression
    | NOT unary_expression
    ;

primary_expression
    : constant
    | input_alias
    // parameter_name: Represents a value of the specified parameter name. Parameter names must have a single @ as the first character.
    | array_create_expression
    | object_create_expression
    | function_call_expression
    | LEFT_PARENTHESIS scalar_expression RIGHT_PARENTHESIS
    | LEFT_PARENTHESIS sql_query RIGHT_PARENTHESIS
    | primary_expression DOT property_name
    | primary_expression LEFT_BRACKET scalar_expression RIGHT_BRACKET
    | exists_scalar_expression
    | array_scalar_expression
    ;

constant
    : K_undefined
    | K_null
    | K_true
    | K_false
    | NUMBER
    | QUOTE ID QUOTE
    | array_constant
    | object_constant
    ;

array_constant
    : LEFT_BRACKET array_constant_list? RIGHT_BRACKET
    ;

array_constant_list
    : constant
    | array_constant_list COMMA constant
    ;

object_constant
    : LEFT_BRACE object_constant_items? RIGHT_BRACE
    ;

object_constant_item
    : property_name COLON constant
    ;

object_constant_items
    : object_constant_item
    | object_constant_items COMMA object_constant_item
    ;
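For orientation (an illustration, not part of the commit): `sql_query` says every statement is a `select_clause` followed by optional `from_clause`, `where_clause`, and `orderby_clause`. Some inputs the grammar is meant to accept, annotated with the rules they exercise:

```typescript
// Hypothetical sample inputs; the comments name rules in CosmosDBSqlParser.g4.
const sampleQueries: string[] = [
  "SELECT * FROM c",                            // selection = MUL, from_clause
  "SELECT TOP 10 c.name FROM c",                // top_spec, relative_path
  "SELECT VALUE c.age FROM c WHERE c.age > 21", // select_value_spec, where_clause
  "SELECT c.name FROM c ORDER BY c.name DESC",  // orderby_clause, orderby_item
];

// Per sql_query, every statement must begin with a select_clause.
const allStartWithSelect = sampleQueries.every((q) =>
  q.toUpperCase().startsWith("SELECT"),
);
```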
79 src/facade/LanguageServiceFacade.ts Normal file
@ -0,0 +1,79 @@
import * as Q from "q";
import { editor, languages, MarkerSeverity } from "monaco-editor";

export enum ParseReason {
    GetCompletionWords = 1,
    GetErrors = 2
}

export class LanguageServiceFacade {
    private static readonly timeout : number = 2000;

    private static workingWorker : Worker = null;

    public static GetLanguageServiceParseResult(str : string, parseReason : ParseReason) : Q.Promise<any[]> {
        const timeExceeded = Q.Promise<any[]>((resolve : any, reject : any) => {
            const wait = setTimeout(() => {
                const words : any = [];
                resolve(words);
            }, LanguageServiceFacade.timeout);
        });

        const result = LanguageServiceFacade.GetParseResult(str, parseReason);
        return Q.race([timeExceeded, result]).then(function(words) {
            LanguageServiceFacade.workingWorker.terminate();
            return words;
        });
    }

    private static GetParseResult = (str : string, parseReason : ParseReason) : Q.Promise<any[]> => {
        return Q.Promise((resolve : any) => {

            if (LanguageServiceFacade.workingWorker != null) {
                LanguageServiceFacade.workingWorker.terminate();
            }

            const currentUrlWithoutQueryParamsAndHashRoute: string = `${window.location.protocol}//${window.location.host}${window.location.pathname}`;
            let url = currentUrlWithoutQueryParamsAndHashRoute.replace(/\/[^\/]*$/, '/node_modules/cosmosdb-language-service/dist/worker/dist/LanguageServiceWorker.js');
            LanguageServiceFacade.workingWorker = new Worker(url);

            LanguageServiceFacade.workingWorker.onmessage = (ev : MessageEvent) => {
                const processedResults: any = [];

                const results : any[] = ev.data;

                if (parseReason === ParseReason.GetCompletionWords) {
                    results.forEach((label: string) => {
                        if (!!label) {
                            processedResults.push({
                                label: label,
                                kind: languages.CompletionItemKind.Keyword
                            });
                        }
                    });
                } else if (parseReason === ParseReason.GetErrors) {
                    results.forEach((err: any) => {
                        const mark: editor.IMarkerData = {
                            severity: MarkerSeverity.Error,
                            message: err.Message,
                            startLineNumber: err.line,
                            startColumn: err.column,
                            endLineNumber: err.line,
                            endColumn: err.column
                        };

                        processedResults.push(mark);
                    });
                }

                resolve(processedResults);
            }

            const source = {
                code : str,
                reason : parseReason
            };
            LanguageServiceFacade.workingWorker.postMessage(source);
        });
    }
}
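The core of the facade is the race between the worker's parse result and a fixed two-second timer, so a slow parse can never block the editor. A minimal sketch of that pattern with plain Promises (illustrative only; the file above uses the Q library and a Web Worker):

```typescript
// Race the real work against a timer that resolves with a fallback value.
// Whichever settles first wins; the slow branch is simply ignored.
function withTimeout<T>(work: Promise<T>, fallback: T, ms: number): Promise<T> {
  const timeExceeded = new Promise<T>((resolve) => {
    setTimeout(() => resolve(fallback), ms);
  });
  return Promise.race([work, timeExceeded]);
}

// Fast work wins the race; work that never settles falls back after `ms`.
const fast = withTimeout(Promise.resolve(["SELECT"]), [], 50);
const slow = withTimeout(new Promise<string[]>(() => {}), [], 10);
```

Note that in the sketch, as in the facade, the losing branch is not cancelled; the facade handles that by explicitly calling `terminate()` on the worker once the race settles.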
33 src/language-service/LSCommonTokenStream.ts Normal file
@ -0,0 +1,33 @@
//-----------------------------------------------------------------------------
// Copyright (c) Microsoft Corporation. All rights reserved.
//-----------------------------------------------------------------------------

import { CommonTokenStream } from "antlr4/CommonTokenStream";
import { Lexer } from "antlr4/Lexer";
import { Token } from "antlr4/Token";

export class LSCommonTokenStream extends CommonTokenStream {
    public EofListener;

    constructor(tokenSource : Lexer) {
        super(tokenSource);
    }

    public LA(i : number) : number {
        let token : number = super.LA(i);

        if (token != null && token == Token.EOF && this.EofListener != undefined) {
            this.EofListener();
        }
        return token;
    }

    public LT(i : number) : any {
        let token = super.LT(i);

        if (token != null && token.type == Token.EOF && this.EofListener != undefined) {
            this.EofListener();
        }
        return token;
    }
}
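The two overrides above exist only to observe when lookahead reaches end of input. Stripped of the antlr4 types, the hook pattern looks like this (a sketch; `TokenStream` here is a stand-in class, not the antlr4 one):

```typescript
const EOF = -1; // stand-in for antlr4's Token.EOF

class TokenStream {
  constructor(protected tokens: number[]) {}
  public LA(i: number): number {
    // 1-based lookahead; past the end of input we report EOF.
    return i <= this.tokens.length ? this.tokens[i - 1] : EOF;
  }
}

// Same shape as LSCommonTokenStream: forward to the base class and fire
// a callback whenever a lookahead call observes EOF.
class ListeningTokenStream extends TokenStream {
  public EofListener?: () => void;
  public LA(i: number): number {
    const token = super.LA(i);
    if (token === EOF && this.EofListener !== undefined) {
      this.EofListener();
    }
    return token;
  }
}

let eofSeen = false;
const stream = new ListeningTokenStream([10, 20]);
stream.EofListener = () => { eofSeen = true; };
stream.LA(1); // in range: listener not fired
stream.LA(3); // past the end: listener fires
```

The prediction simulator below sets exactly such a listener to learn that the parser's lookahead ran off the end of an incomplete query.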
18 src/language-service/LSErrorListener.ts Normal file
@ -0,0 +1,18 @@
//-----------------------------------------------------------------------------
// Copyright (c) Microsoft Corporation. All rights reserved.
//-----------------------------------------------------------------------------

import { ErrorListener } from "antlr4/error/ErrorListener";

export class LSErrorListener extends ErrorListener {
    private AddSyntaxError : (msg : string, line : number, column : number) => any;

    constructor(AddSyntaxError : (msg : string, line : number, column : number) => any) {
        super();
        this.AddSyntaxError = AddSyntaxError;
    }

    public syntaxError(recognizer: any, offendingSymbol: any, line: number, column: number, msg: string, e: any): void {
        this.AddSyntaxError(msg, line, column);
    }
}
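A typical wiring of the `AddSyntaxError` callback (a sketch, not part of the commit): the host collects each report into a record shaped like the `ErrorMarkItem` interface defined in LanguageService.ts, which the facade later converts into editor markers.

```typescript
// Mirrors the ErrorMarkItem interface in LanguageService.ts; the field
// casing ("Message") matches what LanguageServiceFacade reads back.
interface ErrorMarkItem {
  line: number;
  column: number;
  Message: string;
}

const errors: ErrorMarkItem[] = [];

// Same signature as the callback LSErrorListener's constructor expects.
const addSyntaxError = (msg: string, line: number, column: number): void => {
  errors.push({ line, column, Message: msg });
};

// e.g. new LSErrorListener(addSyntaxError) attached to the parser
addSyntaxError("mismatched input '<EOF>'", 1, 7);
```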
299 src/language-service/LSParserATNSimulator.ts Normal file
@ -0,0 +1,299 @@
//-----------------------------------------------------------------------------
// Copyright (c) Microsoft Corporation. All rights reserved.
//-----------------------------------------------------------------------------

import * as ATNState from "antlr4/atn/ATNState";
import * as Transition from "antlr4/atn/Transition";
import { ATN } from "antlr4/atn/ATN";
import { CommonTokenStream } from "antlr4/CommonTokenStream";
import { DFA } from "antlr4/dfa/DFA";
import { LanguageService } from "./LanguageService";
import { NoViableAltException } from "antlr4/error/Errors";
import { Parser } from "antlr4/Parser";
import { ParserATNSimulator } from "antlr4/atn/ParserATNSimulator";
import { PredictionContextCache } from "antlr4/PredictionContext";
import { PredictionMode } from "antlr4/atn/PredictionMode";
import { RuleContext } from "antlr4/RuleContext";
import { Token } from "antlr4/Token";
import { Utils } from "./Utils";

interface StateWithTransitionPath {
    state : ATNState.ATNState,
    transitionStates : ATNState.ATNState[]
}

export class LSParserATNSimulator extends ParserATNSimulator {
    private predictionMode = PredictionMode.LL;

    private parser : Parser;

    private atn : ATN;

    private languageService : LanguageService;

    constructor(parser : Parser, atn : ATN, decisionToDFA : Array<DFA>, sharedContextCache : PredictionContextCache, languageService : LanguageService) {
        super(parser, atn, decisionToDFA, sharedContextCache);
        this.parser = parser;
        this.atn = atn;
        this.languageService = languageService;
    }

    public adaptivePredict(input : CommonTokenStream, decision : number, outerContext : RuleContext) {
        let tokensLeft : number = -1;

        try {
            this.languageService.IsInPredict = true;
            this.languageService.EofReachedInPredict = false;

            if (decision >= 0) {
                return super.adaptivePredict(input, decision, outerContext);
            }
        }
        catch (error) {
            if (error instanceof NoViableAltException && error.offendingToken.type === Token.EOF) {
                tokensLeft = error.offendingToken.tokenIndex - this.parser.getCurrentToken().tokenIndex;
                return 1;
            } else {
                throw error;
            }
        }
        finally {
            if (this.languageService.EofReachedInPredict) {
                if (tokensLeft < 0) {
                    tokensLeft = 0;
                    while (input.LA(tokensLeft + 1) != Token.EOF) {
                        tokensLeft++;
                    }
                }

                if (tokensLeft > 0) {
                    let states = this.CalculateValidStates(input, tokensLeft);
                    this.languageService.RecordErrorStatesBeforeEof(states);
                }
            }

            this.languageService.IsInPredict = false;
        }
    }

    private CalculateValidStates(input : CommonTokenStream, tokensLeft : number): ATNState.ATNState[] {
        let state = this.atn.states[this.parser.state];
        let states : StateWithTransitionPath[] = [ {
            state : state,
            transitionStates : []
        } ];
        let validStates : StateWithTransitionPath[] = [];

        // One step at a time: consume a single token on each iteration.
        for (let index = 1; index <= tokensLeft; index++) {
            let _states : StateWithTransitionPath[] = [];
            let nextToken : number = input.LA(index);

            states.forEach(s => { _states = _states.concat(this.ConsumeSingleTokenAhead(s, nextToken)).filter(Utils.notDuplicate); });
            states = _states.filter(Utils.notDuplicate);
        }

        states.forEach(s => { validStates = validStates.concat(this.SearchValidStates(s)); });
        return validStates.map(s => s.state).filter(Utils.notDuplicate);
    }

    private ConsumeSingleTokenAhead(stateWithTransitionPath : StateWithTransitionPath, matchToken : Token) : StateWithTransitionPath[] {
        let validStates : StateWithTransitionPath[] = [];
        let currentState = stateWithTransitionPath.state;
        let nextStateWithTransitionPath : StateWithTransitionPath = {
            state : null, // temporary null
            transitionStates : stateWithTransitionPath.transitionStates.slice()
        };

        if (nextStateWithTransitionPath.transitionStates.length > 0 &&
            nextStateWithTransitionPath.transitionStates[nextStateWithTransitionPath.transitionStates.length - 1].ruleIndex === currentState.ruleIndex) {
            nextStateWithTransitionPath.transitionStates.pop();
        }

        nextStateWithTransitionPath.transitionStates.push(currentState);

        if (!(currentState instanceof ATNState.RuleStopState)) {
            for (let index = 0; index < currentState.transitions.length; index++) {
                let transition = currentState.transitions[index];
                let destinationChildState = transition.target;
                nextStateWithTransitionPath.state = destinationChildState;

                if (!transition.isEpsilon) {
                    if (transition.label != null && transition.label.contains(matchToken)) {
                        validStates = validStates.concat(this.SearchValidStates(nextStateWithTransitionPath));
                    }
                } else {
                    validStates = validStates.concat(this.ConsumeSingleTokenAhead(nextStateWithTransitionPath, matchToken)).filter(Utils.notDuplicate);
                }
            }
        }

        return validStates.filter(Utils.notEmpty);
    }

    private SearchValidStates(stateWithTransitionPath : StateWithTransitionPath) : StateWithTransitionPath[] {
        let validStates : StateWithTransitionPath[] = [];

        if (!this.IsLastStateBeforeRuleStopState(stateWithTransitionPath.state)) {
            validStates.push(stateWithTransitionPath);
        } else {
            validStates = this.BackTracingAndFindActiveStates(stateWithTransitionPath).filter(Utils.notDuplicate);

            if (this.HasActiveChildrenState(stateWithTransitionPath.state)) {
                validStates.push(stateWithTransitionPath);
            }
        }
        return validStates;
    }

    private BackTracingAndFindActiveStates(stateWithTransitionPath : StateWithTransitionPath) : StateWithTransitionPath[] {
        let validStates : StateWithTransitionPath[] = [];
        let completedRuleIndex = stateWithTransitionPath.state.ruleIndex;
        let statesStack = this.GetLastStateInDifferentRulesFomStatesStack(stateWithTransitionPath.transitionStates, completedRuleIndex);
        let currentStateIndex = statesStack.length - 1;
        let keepBackTracing : boolean = true;

        while (keepBackTracing && currentStateIndex >= 0) {
            let currentState = statesStack[currentStateIndex];
            keepBackTracing = false;
            let followingStates = this.GetRuleFollowingState(currentState, completedRuleIndex);

            for (let index = 0; index < followingStates.length; index++) {
                let lastStateBeforeRuleStopState : boolean = false;
                let haveActiveChildrenStatesInCurrentRule : boolean = false;
                let transitions = followingStates[index].transitions;
                while (transitions.length > 0) {
                    let epsilonTrans = [];
                    for (let tIndex = 0; tIndex < transitions.length; tIndex++) {
                        if (transitions[tIndex].isEpsilon) {
                            if (transitions[tIndex] instanceof Transition.RuleTransition) {
                                haveActiveChildrenStatesInCurrentRule = true;
                            } else if (transitions[tIndex].target instanceof ATNState.RuleStopState) {
                                lastStateBeforeRuleStopState = true;
                            } else {
                                epsilonTrans = epsilonTrans.concat(transitions[tIndex].target.transitions);
                            }
                        } else {
                            haveActiveChildrenStatesInCurrentRule = true;
                        }
                    }

                    transitions = epsilonTrans;
                    if (lastStateBeforeRuleStopState && haveActiveChildrenStatesInCurrentRule) {
                        // We can jump out of the loop ahead of schedule.
                        break;
                    }
                }

                if (lastStateBeforeRuleStopState) {
                    keepBackTracing = true;
                }

                if (haveActiveChildrenStatesInCurrentRule) {
                    //validStates.push(followingStates[index]);
                    let newValidState : StateWithTransitionPath = {
                        state : followingStates[index],
                        transitionStates : statesStack.slice(0, currentStateIndex + 1)
                    };
                    validStates.push(newValidState);
                }
            }

            currentStateIndex--;

            if (keepBackTracing) {
                completedRuleIndex = followingStates[0].ruleIndex;
            }
        }

        return validStates.filter(Utils.notEmpty);
    }

    private GetLastStateInDifferentRulesFomStatesStack(statesStack : ATNState.ATNState[], lastMatchedRuleIndex : number) : ATNState.ATNState[] {
        let lastStates : ATNState.ATNState[] = [];
        let matchedRuleIndex = lastMatchedRuleIndex;
        for (let currentStateIndex = statesStack.length - 1; currentStateIndex >= 0; currentStateIndex--) {
            if (statesStack[currentStateIndex].ruleIndex === matchedRuleIndex) {
                continue;
            } else {
                lastStates.push(statesStack[currentStateIndex]);
                matchedRuleIndex = statesStack[currentStateIndex].ruleIndex;
            }
        }

        lastStates.reverse();
        return lastStates.filter(Utils.notEmpty);
    }

    private GetRuleFollowingState(state : ATNState.ATNState, ruleIndex : number) : ATNState.ATNState[] {
        let followingStates : ATNState.ATNState[] = [];

        if (state instanceof ATNState.RuleStopState) {
            return followingStates;
        }

        let transitions = state.transitions;

        while (transitions.length > 0) {
            let epsilonTrans = [];
            for (let index = 0; index < transitions.length; index++) {
                if (transitions[index].isEpsilon) {
                    if (transitions[index] instanceof Transition.RuleTransition) {
                        if (transitions[index].ruleIndex === ruleIndex) {
                            followingStates.push(transitions[index].followState);
                        }
                    } else if (!(transitions[index].target instanceof ATNState.RuleStopState)) {
                        epsilonTrans = epsilonTrans.concat(transitions[index].target.transitions);
                    }
                }
            }

            transitions = epsilonTrans;
        }

        return followingStates.filter(Utils.notEmpty);
    }

    // Means that from this state, the parser can make up a complete rule.
    private IsLastStateBeforeRuleStopState(state : ATNState.ATNState) {
        let transitions = state.transitions;

        while (transitions.length > 0) {
            let epsilonTrans = [];
            for (let index = 0; index < transitions.length; index++) {
                if (transitions[index].isEpsilon) {
                    if (transitions[index].target instanceof ATNState.RuleStopState) {
                        return true;
                    } else if (!(transitions[index] instanceof Transition.RuleTransition)) {
                        epsilonTrans = epsilonTrans.concat(transitions[index].target.transitions);
                    }
                }
            }

            transitions = epsilonTrans;
        }
        return false;
    }

    private HasActiveChildrenState(state : ATNState.ATNState) : boolean {
        let transitions = state.transitions;

        while (transitions.length > 0) {
            let epsilonTrans = [];
            for (let index = 0; index < transitions.length; index++) {
                if (transitions[index].isEpsilon) {
                    if (transitions[index] instanceof Transition.RuleTransition) {
                        return true;
                    } else if (!(transitions[index].target instanceof ATNState.RuleStopState)) {
                        epsilonTrans = epsilonTrans.concat(transitions[index].target.transitions);
                    }
                } else {
                    return true;
                }
            }

            transitions = epsilonTrans;
        }
        return false;
    }
}
179  src/language-service/LanguageService.ts  Normal file
@@ -0,0 +1,179 @@
//-----------------------------------------------------------------------------
// Copyright (c) Microsoft Corporation. All rights reserved.
//-----------------------------------------------------------------------------

import * as antlr4 from "antlr4";
import * as ATNState from "antlr4/atn/ATNState";
import { InputStream } from "antlr4/InputStream";
import { IntervalSet } from "antlr4/IntervalSet";
import { Lexer } from "antlr4/Lexer";
import { LSCommonTokenStream } from "./LSCommonTokenStream";
import { LSErrorListener } from "./LSErrorListener";
import { LSParserATNSimulator } from "./LSParserATNSimulator";
import { Parser } from "antlr4/Parser";
import { Utils } from "./Utils";

interface ErrorMarkItem {
    line: number;
    column: number;
    Message: string;
}

interface StateContextDict {
    [key: number]: StateContext;
}

class StateContext {
    public State: number;

    public ExpectedTokens: IntervalSet;

    public RuleIndex: number;

    public RuleStack: string[];

    constructor(state: number, ruleIndex: number, expectedTokens: IntervalSet, ruleStack: string[]) {
        this.State = state;
        this.RuleIndex = ruleIndex;
        this.ExpectedTokens = expectedTokens;
        this.RuleStack = ruleStack;
    }
}

export class LanguageService {
    private _lexerCtr: any;
    private _parserCtr: any;
    private _lexer: Lexer = null;
    private _parser: Parser = null;

    private _keywordsDict: { [key: string]: string } = null;

    public StatesBeforeEof: StateContextDict = {};

    public SyntaxErrors: ErrorMarkItem[] = [];

    private EofReached: boolean = false;

    public EofReachedInPredict: boolean = false;

    private ExThrownAfterEofReached: boolean = false;

    public IsInPredict: boolean = false;

    constructor(lexerCtr: Lexer, parserCtr: Parser, keywordsDict: { [key: string]: string }) {
        this._lexerCtr = lexerCtr;
        this._parserCtr = parserCtr;
        this._keywordsDict = keywordsDict;
    }

    private _parse(input: string) {
        this.PrepareParse();
        this._lexer = new this._lexerCtr(new InputStream(input));
        this._parser = new this._parserCtr(new LSCommonTokenStream(this._lexer));

        this._parser.getTokenStream().EofListener = () => {
            this.RecordStateBeforeEof();
        };

        this._parser.removeErrorListeners();
        this._parser.addErrorListener(new LSErrorListener(
            (msg, line, column) => {
                this.AddSyntaxError(msg, line, column);
            }
        ));

        let decisionsToDFA = this._parser.atn.decisionToState.map((ds, index) => { return new antlr4.dfa.DFA(ds, index); });
        this._parser._interp = new LSParserATNSimulator(this._parser, this._parser.atn, decisionsToDFA, new antlr4.PredictionContextCache(), this);
        this._parser.root();
    }

    public GetExpectedTokenStrs = function (): string[] {
        let intervalSets = new IntervalSet();
        for (var key in this.StatesBeforeEof) {
            if (this.StatesBeforeEof.hasOwnProperty(key)) {
                intervalSets.addSet(this.StatesBeforeEof[key].ExpectedTokens);
            }
        }

        var expectedStrings = [];
        if (intervalSets.intervals === null) {
            return expectedStrings;
        }

        for (var i = 0; i < intervalSets.intervals.length; i++) {
            var v = intervalSets.intervals[i];
            if (v.start < 0) {
                continue;
            }

            for (var j = v.start; j < v.stop; j++) {
                var tokenString = this._parser._input.tokenSource.symbolicNames[j];
                if (tokenString != null) {
                    let keyword = this._keywordsDict[tokenString.replace(/^\'|\'$/gi, "")];
                    if (keyword != null) {
                        expectedStrings.push(keyword);
                    }
                }
            }
        }

        return expectedStrings.filter(Utils.notDuplicate);
    }

    public RecordStateBeforeEof = function () {
        if (!this.IsInPredict) {
            this.EofReached = true;
            if (!this.ExThrownAfterEofReached) {
                if (this.StatesBeforeEof[this._parser.state] == undefined || this.StatesBeforeEof[this._parser.state] == null) {
                    this.StatesBeforeEof[this._parser.state] = new StateContext(this._parser.state, this._parser._ctx.ruleIndex, this._parser.getExpectedTokens(), this._parser.getRuleInvocationStack());
                }
            }
        } else {
            this.EofReachedInPredict = true;
        }
    }

    public RecordErrorStatesBeforeEof = function (states: ATNState.ATNState[]) {
        if (states.length > 0) {
            states.forEach(state => {
                if (state != null) {
                    if (this.StatesBeforeEof[state.stateNumber] == undefined || this.StatesBeforeEof[state.stateNumber] == null) {
                        this.StatesBeforeEof[state.stateNumber] = new StateContext(state.stateNumber, state.ruleIndex, this._parser._interp.atn.nextTokens(state), this._parser.getRuleInvocationStack());
                    }
                }
            });
        }
    }

    public AddSyntaxError = (msg: string, line: number, column: number): any => {
        let error: ErrorMarkItem = {
            line: line,
            column: column,
            Message: msg
        };

        this.SyntaxErrors.push(error);

        if (this.EofReached) {
            this.ExThrownAfterEofReached = true;
        }
    }

    public PrepareParse(): any {
        this.EofReached = false;
        this.EofReachedInPredict = false;
        this.ExThrownAfterEofReached = false;
        this.StatesBeforeEof = {};
        this.SyntaxErrors = [];
    }

    public getCompletionWords(input: string): string[] {
        this._parse(input);
        return this.GetExpectedTokenStrs();
    }

    public getSyntaxErrors(input: string): ErrorMarkItem[] {
        this._parse(input);
        return this.SyntaxErrors;
    }
}
13  src/language-service/Utils.ts  Normal file
@@ -0,0 +1,13 @@
//-----------------------------------------------------------------------------
// Copyright (c) Microsoft Corporation. All rights reserved.
//-----------------------------------------------------------------------------

export class Utils {
    public static notEmpty<TValue>(value: TValue | null | undefined): value is TValue {
        return value !== null && value !== undefined;
    }

    public static notDuplicate(item, pos, self) {
        return self.indexOf(item) == pos;
    }
}
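As a quick illustration (not part of the commit), both predicates are shaped to plug directly into `Array.prototype.filter`: `notEmpty` narrows out null/undefined entries and `notDuplicate` keeps only the first occurrence of each item, matching how the language service filters follow states and expected keywords. The snippet re-declares the class so it runs standalone:

```typescript
class Utils {
    // Type guard: filters out null/undefined and narrows the element type.
    public static notEmpty<TValue>(value: TValue | null | undefined): value is TValue {
        return value !== null && value !== undefined;
    }

    // Keeps an item only if this is its first occurrence in the array.
    public static notDuplicate(item: any, pos: number, self: any[]): boolean {
        return self.indexOf(item) == pos;
    }
}

const states = ["s1", null, "s2", undefined].filter(Utils.notEmpty); // ["s1", "s2"]
const words = ["FROM", "WHERE", "FROM"].filter(Utils.notDuplicate);  // ["FROM", "WHERE"]
```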
9  src/providers/ErrorMarkProvider.ts  Normal file
@@ -0,0 +1,9 @@
import * as Q from "q";
import { editor } from "monaco-editor";
import { LanguageServiceFacade, ParseReason } from "../facade/LanguageServiceFacade";

export class ErrorMarkProvider {
    public static getErrorMark(input: string): Q.Promise<editor.IMarkerData[]> {
        return LanguageServiceFacade.GetLanguageServiceParseResult(input, ParseReason.GetErrors);
    }
}
19  src/providers/SqlCompletionItemProvider.ts  Normal file
@@ -0,0 +1,19 @@
import { LanguageServiceFacade, ParseReason } from "../facade/LanguageServiceFacade";
import { editor, Position, CancellationToken } from "monaco-editor";

export class SqlCompletionItemProvider {
    public triggerCharacters: string[] = [" ", "."];

    provideCompletionItems(model: editor.IReadOnlyModel, position: Position, token: CancellationToken) {
        const range = {
            startLineNumber: 1,
            startColumn: 1,
            endLineNumber: position.lineNumber,
            endColumn: position.column
        };

        let text = model.getValueInRange(range);
        text = this.triggerCharacters.indexOf(text.charAt(text.length - 1)) < 0 ? text.substring(0, text.length - 1) : text;
        return LanguageServiceFacade.GetLanguageServiceParseResult(text, ParseReason.GetCompletionWords);
    }
}
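The last-character handling in the provider can be isolated into a small helper for illustration (the function name here is hypothetical, not from the source): if the character before the cursor is not a trigger character, it is treated as a partially typed token and dropped before the text is handed to the parser; if it is a trigger character (space or dot), the text is kept as-is.

```typescript
const triggerCharacters: string[] = [" ", "."];

// Hypothetical helper mirroring the trimming in provideCompletionItems.
function textForCompletion(text: string): string {
    const last = text.charAt(text.length - 1);
    // A non-trigger final character is a partially typed token: drop it.
    return triggerCharacters.indexOf(last) < 0 ? text.substring(0, text.length - 1) : text;
}
```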
32  src/worker/LanguageServiceWorker.ts  Normal file
@@ -0,0 +1,32 @@
import { LanguageService } from "../language-service/LanguageService";
import { CosmosDBSqlLexer } from "../cosmosdb-sql/generated/CosmosDBSqlLexer";
import { CosmosDBSqlParser } from "../cosmosdb-sql/generated/CosmosDBSqlParser";
import { CosmosDBSqlKeywords } from "../cosmosdb-sql/grammar/CosmosDBSqlKeywords";

enum ParseReason {
    GetCompletionWords = 1,
    GetErrors = 2
}

export module LanguageServiceWorker {
    // Respond to messages from the parent thread.
    onmessage = (event: MessageEvent) => {
        const code: string = event.data.code;
        const reason: number = event.data.reason;

        let parseResults = [];

        let languageService = new LanguageService(CosmosDBSqlLexer, CosmosDBSqlParser, CosmosDBSqlKeywords.keywordsRegisteredForCompletion);

        if (reason == ParseReason.GetCompletionWords) {
            parseResults = languageService.getCompletionWords(code);
        } else if (reason == ParseReason.GetErrors) {
            parseResults = languageService.getSyntaxErrors(code);
        }

        postMessage(parseResults, undefined);
        close();
    };
}
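The worker's request handling reduces to a small dispatch over the `reason` field. A minimal sketch, with a stub standing in for `LanguageService` (the stub and helper names below are illustrative, not from the source):

```typescript
enum ParseReason {
    GetCompletionWords = 1,
    GetErrors = 2
}

// Minimal surface of the service the worker dispatches to.
interface ParseService {
    getCompletionWords(code: string): string[];
    getSyntaxErrors(code: string): { line: number; column: number; Message: string }[];
}

// Mirrors the worker's branching: pick the parse operation from the reason code.
function handleRequest(code: string, reason: number, svc: ParseService): any[] {
    if (reason === ParseReason.GetCompletionWords) {
        return svc.getCompletionWords(code);
    } else if (reason === ParseReason.GetErrors) {
        return svc.getSyntaxErrors(code);
    }
    return []; // unknown reason: nothing to report
}
```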
29  src/worker/webpack.config.js  Normal file
@@ -0,0 +1,29 @@
"use strict";

module.exports = {
    entry: {
        LanguageServiceWorker: './LanguageServiceWorker.ts'
    },
    mode: 'production',
    target: 'web',
    module: {
        rules: [
            {
                test: /\.ts$/,
                use: 'ts-loader',
                exclude: /node_modules/
            }
        ]
    },
    resolve: {
        extensions: ['.ts', '.js']
    },
    output: {
        globalObject: 'this',
        path: __dirname + "/dist",
        filename: '[name].js',
        library: '[name]',
        libraryTarget: 'umd'
    },
    node: { fs: "empty" }
};
17  test/ManualTestDescription.txt  Normal file
@@ -0,0 +1,17 @@
Using the language service package:
- Define an ANTLR grammar for your language. Generate the parser and lexer with the antlr command, and define the keywords dictionary: each key is a keyword defined in Lexer.g4, and each value is the word shown as the expected token.
- Initialize the LanguageService in this package with the parser, lexer, and keywords dictionary. The parser and lexer are generated from the grammar file; the keywords dictionary is defined by language service users.
- Use the getCompletionWords method of LanguageService to get the expected keywords for the current input script.
- Use the getSyntaxErrors method of LanguageService to get the syntax errors for the current input script.

Manual test:
Steps:
- Define a grammar such as "SELECT FROM" on the client side.
- Open the SQL query editor.
- Type "SELECT" and then a whitespace to get the expected token.
Expected:
- The "FROM" token appears in the expected-token list shown in the IDE.
- An error marker appears under the "SELECT" token. When hovering over the token, the error message "mismatched input '<EOF>' expecting {FROM}" is shown.
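To make the dictionary shape concrete, here is a hypothetical keywords dictionary of the kind described above; the token names and values are assumptions for illustration, not taken from the actual grammar:

```typescript
// Illustrative only: keys are lexer token names from Lexer.g4, values are the
// completion words surfaced to the user.
const keywordsRegisteredForCompletion: { [key: string]: string } = {
    "SELECT": "SELECT",
    "FROM": "FROM",
    "WHERE": "WHERE"
};
```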
19  tsconfig.json  Normal file
@@ -0,0 +1,19 @@
{
    "compilerOptions": {
        "outDir": "./dist",
        "allowJs": true,
        "sourceMap": true,
        "noImplicitReturns": true,
        "noFallthroughCasesInSwitch": true,
        "module": "commonjs",
        "target": "es5",
        "lib": [
            "es5",
            "es6",
            "dom"
        ]
    },
    "include": [
        "./src/**/*"
    ]
}