From 69a16000b353f15672ba16792a1f7ce0c0b85bd9 Mon Sep 17 00:00:00 2001 From: Kevin Carroll Date: Sat, 18 Mar 2023 17:21:06 -0700 Subject: [PATCH 1/8] WIP, begin documentation for file uploads --- docs/reference/_category_.json | 2 +- docs/serverExtensions/_category_.json | 5 + docs/serverExtensions/batch-processing.md | 26 ++ docs/serverExtensions/file-uploads.md | 157 +++++++++ package.json | 6 +- yarn.lock | 378 +++++++++++----------- 6 files changed, 390 insertions(+), 184 deletions(-) create mode 100644 docs/serverExtensions/_category_.json create mode 100644 docs/serverExtensions/batch-processing.md create mode 100644 docs/serverExtensions/file-uploads.md diff --git a/docs/reference/_category_.json b/docs/reference/_category_.json index 022c4df..5d9d820 100644 --- a/docs/reference/_category_.json +++ b/docs/reference/_category_.json @@ -1,5 +1,5 @@ { "label": "References", - "position": 8, + "position": 9, "collapsed": false } \ No newline at end of file diff --git a/docs/serverExtensions/_category_.json b/docs/serverExtensions/_category_.json new file mode 100644 index 0000000..52eeb8f --- /dev/null +++ b/docs/serverExtensions/_category_.json @@ -0,0 +1,5 @@ +{ + "label": "Extensions", + "position": 8, + "collapsed": false +} \ No newline at end of file diff --git a/docs/serverExtensions/batch-processing.md b/docs/serverExtensions/batch-processing.md new file mode 100644 index 0000000..1803e47 --- /dev/null +++ b/docs/serverExtensions/batch-processing.md @@ -0,0 +1,26 @@ +--- +id: batch-processing +title: Batch Query Processing +sidebar_label: Batch Processing +sidebar_position: 1 +--- + +## GraphQL Multipart Request Specification +GraphQL ASP.NET provides built in support for batch query processing via an implementation of the `GraphQL Multipart Request Specification`. You can read the specification [here](https://github.com/jaydenseric/graphql-multipart-request-spec) if you are interested in the details. + +## Enable Batch Query Support + +While batch query support is shipped as part of the main library it is disabled by default and must be explicitly enabled as an extension to each individual schema. + +```csharp title='Register the Server Extension' +// Startup Code +// other code omitted for brevity +services.AddGraphQL(options => { + options.RegisterExtension(); +}); +``` + + +:::note +Batch query processing and [file uploads](./file-uploads.md) are implemented as part of the same specification and therefore are encapsulated in the same extension. +::: \ No newline at end of file diff --git a/docs/serverExtensions/file-uploads.md b/docs/serverExtensions/file-uploads.md new file mode 100644 index 0000000..7ef00d4 --- /dev/null +++ b/docs/serverExtensions/file-uploads.md @@ -0,0 +1,157 @@ +--- +id: file-uploads +title: File Uploads +sidebar_label: File Uploads +sidebar_position: 0 +--- + +## GraphQL Multipart Request Specification +GraphQL ASP.NET provides built in support for file uploads via an implementation of the `GraphQL Multipart Request Specification`. You can read the +specification [here](https://github.com/jaydenseric/graphql-multipart-request-spec) if you are interested in the details. + +:::caution +This document covers how to setup a controller to accept files from an http request that conforms to the above specification. It provides sample curl requests that would be accepted for the given sample code but does not explain in detail the various form fields required to complete a request. 
It is highly recommended to use a [supported client](https://github.com/jaydenseric/graphql-multipart-request-spec#client) when enabling this server extension. +::: + +## Enable File Upload Support + +While file upload support is shipped as part of the main library it is disabled by default and must be explicitly enabled as an extension to each individual schema. + +```csharp title='Register the Server Extension' +// Startup Code +// other code omitted for brevity +services.AddGraphQL(options => { + options.RegisterExtension(); +}); +``` + +:::tip +File uploads and [batch query processing](./batch-processing.md) are implemented as part of the same specification and are encapsulated in the same extension. +::: + + +## A Basic Controller + +Files are received as a special scalar type `GraphQL.AspNet.ServerExtensions.MultipartRequests.FileUpload`. Add a reference in your controller to this +scalar like you would any other scalar. + +```csharp title=ExampleFile Upload Controller +using GraphQL.AspNet.ServerExtensions.MultipartRequests; + +public class FileUploadController : GraphController +{ + [MutationRoot("singleFileUpload")] + public async Task UploadFile(FileUpload fileRef) + { + var stream = await fileRef.OpenFileAsync(); + // do something with the file stream + + return 0; + } +} +``` + +The scalar is presented to a query as the type `Upload` as defined in the specification. + +```bash title="Sample Query" +curl localhost:3000/graphql \ + -F operations='{ "query": "mutation ($file: Upload) { singleFileUpload(file: $file) }", "variables": { "file": null } }' \ + -F map='{ "0": ["variables", "file"] }' \ + -F 0=@a.txt +``` + +## Handling Arrays of Files + +Arrays of files work just like any other list in GraphQL. When declaring the map variable for the multi-part request, be sure +to indicate which index you are mapping the file to. The extension will not magically append files to an array. Each mapped file must explicitly declare the element index in an array where it is being placed. + +```csharp title=ExampleFile Upload Controller +using GraphQL.AspNet.ServerExtensions.MultipartRequests; + +public class FileUploadController : GraphController +{ + [MutationRoot("multiFileUpload")] + public async Task UploadFile(IList files) + { + foreach(var file in files) + { + using var stream = await fileRef.OpenFileAsync(); + // do something with each file stream + } + + return 0; + } +} +``` + +```bash title="Sample Query" +curl localhost:3000/graphql \ + -F operations='{ "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", "variables": { "files": [null, null] } }' \ + # highlight-next-line + -F map='{ "firstFile": ["variables", "files", 0], "secondFile": ["variables", "files", 1] }' \ + -F firstFile=@a.txt + -F secondFile=@b.txt +``` + +:::info +The server extension will only replace existing `null` values in an array declared on a variable collection. It will NOT attempt to arbitrarily add files to an empty array. + +Notice in the above curl command that the `files` variable is declared as an array with two elements and that the `map` field +points to each of those indexed elements. +::: + +## File Uploads on Batched Queries +File uploads work in conjunction with batched queries. When processing a multi-part request as a batch, prefix each of the mapped object-path references with an index of the batch you want the file to apply to. As you might guess this is usually handled by a supported client automatically. 
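In the sample below, the leading `0` or `1` in each `map` path selects which query in the `operations` array receives the file; the remainder of the path then walks into that query's `variables` collection as usual.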
+ +```bash title="Sample Query" +curl localhost:3000/graphql \ + -F operations='[ + { "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", "variables": { "files": [null, null] } }, + { "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", "variables": { "files": [null, null] } }, + ]' \ + # highlight-next-line + -F map='{ "firstFile": [0, "variables", "files", 0], "secondFile": [1, "variables", "files", 0] }' \ + -F firstFile=@a.txt + -F secondFile=@b.txt +``` + +## The FileUpload Scalar +The following properties on the `FileUpload` scalar can be useful: + +* `FileName` - The name of the file that was uploaded. This property will be null if a non-file form field is referenced +* `MapKey` - The key value used to place this file within a variable collection. This is usually the form field name on the multi-part request. +* `ContentType` - The supplied `content-type` value sent with the file. This value will be null for non-file fields. +* `Headers` - A collection of all the headers provided with the uploaded file. This value will be null for non-file fields. + +## Opening a File Stream +When opening a file stream you need to call await a call `FileUpload.OpenFileAsync()`. This method is an abstraction on top of an internal wrapper that standardizes file streams across all implementions (see below for implementing your own file processor). When working with the standard `IFormFile` interface provided by ASP.NET this call is a simple wrapper for `IFormFile.OpenReadStream()`. + +## Custom File Handling +By default, this extension segments the request on an `HttpContext` and presents the different parts to the query engine in a manner it expects. This means that any uploaded files are delt under the hood with as `IFormFile` references. While this is likely fine for most users, it can be troublesome with regard to timeouts and large file requests. Also, there may be scenarios where you want to save off files prior to executing a query. Perhaps you'll need to process the file stream multiple times? + +Implement and register your own `IFileUploadScalarValueMaker` to add custom processing logic for each file BEFORE graphql gets ahold of it. For instance, some users may want to write incoming files to local disk or cloud storage and present GraphQL with a stream that points to that local reference, rather than the file reference on the raw request. + +```csharp + public interface IFileUploadScalarValueMaker + { + Task CreateFileScalar(IFormFile aspNetFile); + + // This overload is used when processing data received on a + // multi-part form field rather than as a formal file upload. + Task CreateFileScalar(string mapKey, byte[] blobData); + } +``` + +```csharp title=Register Your Custom Value Maker +// startup code + +// register your value maker BEFORE calling .AddGraphQL +services.AddSingleton(); + +services.AddGraphQL(options => { + options.RegisterExtension(); +}); +``` +:::tip +Take a look at the [`default upload scalar value maker`]("http://google.com") for some helpful details when trying to implement your own. 
+::: diff --git a/package.json b/package.json index eb9ae82..68a3150 100644 --- a/package.json +++ b/package.json @@ -14,8 +14,8 @@ "write-heading-ids": "docusaurus write-heading-ids" }, "dependencies": { - "@docusaurus/core": "2.2.0", - "@docusaurus/preset-classic": "2.2.0", + "@docusaurus/core": "^2.3.1", + "@docusaurus/preset-classic": "^2.3.1", "@mdx-js/react": "^1.6.22", "clsx": "^1.2.1", "prism-react-renderer": "^1.3.5", @@ -23,7 +23,7 @@ "react-dom": "^17.0.2" }, "devDependencies": { - "@docusaurus/module-type-aliases": "2.2.0" + "@docusaurus/module-type-aliases": "^2.3.1" }, "browserslist": { "production": [ diff --git a/yarn.lock b/yarn.lock index 299a05d..f0173c1 100644 --- a/yarn.lock +++ b/yarn.lock @@ -1195,10 +1195,10 @@ "@docsearch/css" "3.3.0" algoliasearch "^4.0.0" -"@docusaurus/core@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/core/-/core-2.2.0.tgz#64c9ee31502c23b93c869f8188f73afaf5fd4867" - integrity sha512-Vd6XOluKQqzG12fEs9prJgDtyn6DPok9vmUWDR2E6/nV5Fl9SVkhEQOBxwObjk3kQh7OY7vguFaLh0jqdApWsA== +"@docusaurus/core@2.3.1", "@docusaurus/core@^2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/core/-/core-2.3.1.tgz#32849f2ffd2f086a4e55739af8c4195c5eb386f2" + integrity sha512-0Jd4jtizqnRAr7svWaBbbrCCN8mzBNd2xFLoT/IM7bGfFie5y58oz97KzXliwiLY3zWjqMXjQcuP1a5VgCv2JA== dependencies: "@babel/core" "^7.18.6" "@babel/generator" "^7.18.7" @@ -1210,13 +1210,13 @@ "@babel/runtime" "^7.18.6" "@babel/runtime-corejs3" "^7.18.6" "@babel/traverse" "^7.18.8" - "@docusaurus/cssnano-preset" "2.2.0" - "@docusaurus/logger" "2.2.0" - "@docusaurus/mdx-loader" "2.2.0" + "@docusaurus/cssnano-preset" "2.3.1" + "@docusaurus/logger" "2.3.1" + "@docusaurus/mdx-loader" "2.3.1" "@docusaurus/react-loadable" "5.5.2" - "@docusaurus/utils" "2.2.0" - "@docusaurus/utils-common" "2.2.0" - "@docusaurus/utils-validation" "2.2.0" + "@docusaurus/utils" "2.3.1" + "@docusaurus/utils-common" "2.3.1" + "@docusaurus/utils-validation" "2.3.1" "@slorber/static-site-generator-webpack-plugin" "^4.0.7" "@svgr/webpack" "^6.2.1" autoprefixer "^10.4.7" @@ -1237,7 +1237,7 @@ del "^6.1.1" detect-port "^1.3.0" escape-html "^1.0.3" - eta "^1.12.3" + eta "^2.0.0" file-loader "^6.2.0" fs-extra "^10.1.0" html-minifier-terser "^6.1.0" @@ -1272,33 +1272,33 @@ webpack-merge "^5.8.0" webpackbar "^5.0.2" -"@docusaurus/cssnano-preset@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/cssnano-preset/-/cssnano-preset-2.2.0.tgz#fc05044659051ae74ab4482afcf4a9936e81d523" - integrity sha512-mAAwCo4n66TMWBH1kXnHVZsakW9VAXJzTO4yZukuL3ro4F+JtkMwKfh42EG75K/J/YIFQG5I/Bzy0UH/hFxaTg== +"@docusaurus/cssnano-preset@2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/cssnano-preset/-/cssnano-preset-2.3.1.tgz#e042487655e3e062417855e12edb3f6eee8f5ecb" + integrity sha512-7mIhAROES6CY1GmCjR4CZkUfjTL6B3u6rKHK0ChQl2d1IevYXq/k/vFgvOrJfcKxiObpMnE9+X6R2Wt1KqxC6w== dependencies: cssnano-preset-advanced "^5.3.8" postcss "^8.4.14" postcss-sort-media-queries "^4.2.1" tslib "^2.4.0" -"@docusaurus/logger@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/logger/-/logger-2.2.0.tgz#ea2f7feda7b8675485933b87f06d9c976d17423f" - integrity sha512-DF3j1cA5y2nNsu/vk8AG7xwpZu6f5MKkPPMaaIbgXLnWGfm6+wkOeW7kNrxnM95YOhKUkJUophX69nGUnLsm0A== +"@docusaurus/logger@2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/logger/-/logger-2.3.1.tgz#d76aefb452e3734b4e0e645efc6cbfc0aae52869" + integrity 
sha512-2lAV/olKKVr9qJhfHFCaqBIl8FgYjbUFwgUnX76+cULwQYss+42ZQ3grHGFvI0ocN2X55WcYe64ellQXz7suqg== dependencies: chalk "^4.1.2" tslib "^2.4.0" -"@docusaurus/mdx-loader@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/mdx-loader/-/mdx-loader-2.2.0.tgz#fd558f429e5d9403d284bd4214e54d9768b041a0" - integrity sha512-X2bzo3T0jW0VhUU+XdQofcEeozXOTmKQMvc8tUnWRdTnCvj4XEcBVdC3g+/jftceluiwSTNRAX4VBOJdNt18jA== +"@docusaurus/mdx-loader@2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/mdx-loader/-/mdx-loader-2.3.1.tgz#7ec6acee5eff0a280e1b399ea4dd690b15a793f7" + integrity sha512-Gzga7OsxQRpt3392K9lv/bW4jGppdLFJh3luKRknCKSAaZrmVkOQv2gvCn8LAOSZ3uRg5No7AgYs/vpL8K94lA== dependencies: "@babel/parser" "^7.18.8" "@babel/traverse" "^7.18.8" - "@docusaurus/logger" "2.2.0" - "@docusaurus/utils" "2.2.0" + "@docusaurus/logger" "2.3.1" + "@docusaurus/utils" "2.3.1" "@mdx-js/mdx" "^1.6.22" escape-html "^1.0.3" file-loader "^6.2.0" @@ -1313,13 +1313,13 @@ url-loader "^4.1.1" webpack "^5.73.0" -"@docusaurus/module-type-aliases@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/module-type-aliases/-/module-type-aliases-2.2.0.tgz#1e23e54a1bbb6fde1961e4fa395b1b69f4803ba5" - integrity sha512-wDGW4IHKoOr9YuJgy7uYuKWrDrSpsUSDHLZnWQYM9fN7D5EpSmYHjFruUpKWVyxLpD/Wh0rW8hYZwdjJIQUQCQ== +"@docusaurus/module-type-aliases@2.3.1", "@docusaurus/module-type-aliases@^2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/module-type-aliases/-/module-type-aliases-2.3.1.tgz#986186200818fed999be2e18d6c698eaf4683a33" + integrity sha512-6KkxfAVOJqIUynTRb/tphYCl+co3cP0PlHiMDbi+SzmYxMdgIrwYqH9yAnGSDoN6Jk2ZE/JY/Azs/8LPgKP48A== dependencies: "@docusaurus/react-loadable" "5.5.2" - "@docusaurus/types" "2.2.0" + "@docusaurus/types" "2.3.1" "@types/history" "^4.7.11" "@types/react" "*" "@types/react-router-config" "*" @@ -1327,18 +1327,18 @@ react-helmet-async "*" react-loadable "npm:@docusaurus/react-loadable@5.5.2" -"@docusaurus/plugin-content-blog@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-blog/-/plugin-content-blog-2.2.0.tgz#dc55982e76771f4e678ac10e26d10e1da2011dc1" - integrity sha512-0mWBinEh0a5J2+8ZJXJXbrCk1tSTNf7Nm4tYAl5h2/xx+PvH/Bnu0V+7mMljYm/1QlDYALNIIaT/JcoZQFUN3w== - dependencies: - "@docusaurus/core" "2.2.0" - "@docusaurus/logger" "2.2.0" - "@docusaurus/mdx-loader" "2.2.0" - "@docusaurus/types" "2.2.0" - "@docusaurus/utils" "2.2.0" - "@docusaurus/utils-common" "2.2.0" - "@docusaurus/utils-validation" "2.2.0" +"@docusaurus/plugin-content-blog@2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-blog/-/plugin-content-blog-2.3.1.tgz#236b8ee4f20f7047aa9c285ae77ae36683ad48a3" + integrity sha512-f5LjqX+9WkiLyGiQ41x/KGSJ/9bOjSD8lsVhPvYeUYHCtYpuiDKfhZE07O4EqpHkBx4NQdtQDbp+aptgHSTuiw== + dependencies: + "@docusaurus/core" "2.3.1" + "@docusaurus/logger" "2.3.1" + "@docusaurus/mdx-loader" "2.3.1" + "@docusaurus/types" "2.3.1" + "@docusaurus/utils" "2.3.1" + "@docusaurus/utils-common" "2.3.1" + "@docusaurus/utils-validation" "2.3.1" cheerio "^1.0.0-rc.12" feed "^4.2.2" fs-extra "^10.1.0" @@ -1349,18 +1349,18 @@ utility-types "^3.10.0" webpack "^5.73.0" -"@docusaurus/plugin-content-docs@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-docs/-/plugin-content-docs-2.2.0.tgz#0fcb85226fcdb80dc1e2d4a36ef442a650dcc84d" - integrity 
sha512-BOazBR0XjzsHE+2K1wpNxz5QZmrJgmm3+0Re0EVPYFGW8qndCWGNtXW/0lGKhecVPML8yyFeAmnUCIs7xM2wPw== - dependencies: - "@docusaurus/core" "2.2.0" - "@docusaurus/logger" "2.2.0" - "@docusaurus/mdx-loader" "2.2.0" - "@docusaurus/module-type-aliases" "2.2.0" - "@docusaurus/types" "2.2.0" - "@docusaurus/utils" "2.2.0" - "@docusaurus/utils-validation" "2.2.0" +"@docusaurus/plugin-content-docs@2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-docs/-/plugin-content-docs-2.3.1.tgz#feae1555479558a55182f22f8a07acc5e0d7444d" + integrity sha512-DxztTOBEruv7qFxqUtbsqXeNcHqcVEIEe+NQoI1oi2DBmKBhW/o0MIal8lt+9gvmpx3oYtlwmLOOGepxZgJGkw== + dependencies: + "@docusaurus/core" "2.3.1" + "@docusaurus/logger" "2.3.1" + "@docusaurus/mdx-loader" "2.3.1" + "@docusaurus/module-type-aliases" "2.3.1" + "@docusaurus/types" "2.3.1" + "@docusaurus/utils" "2.3.1" + "@docusaurus/utils-validation" "2.3.1" "@types/react-router-config" "^5.0.6" combine-promises "^1.1.0" fs-extra "^10.1.0" @@ -1371,84 +1371,95 @@ utility-types "^3.10.0" webpack "^5.73.0" -"@docusaurus/plugin-content-pages@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-pages/-/plugin-content-pages-2.2.0.tgz#e3f40408787bbe229545dd50595f87e1393bc3ae" - integrity sha512-+OTK3FQHk5WMvdelz8v19PbEbx+CNT6VSpx7nVOvMNs5yJCKvmqBJBQ2ZSxROxhVDYn+CZOlmyrC56NSXzHf6g== - dependencies: - "@docusaurus/core" "2.2.0" - "@docusaurus/mdx-loader" "2.2.0" - "@docusaurus/types" "2.2.0" - "@docusaurus/utils" "2.2.0" - "@docusaurus/utils-validation" "2.2.0" +"@docusaurus/plugin-content-pages@2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-pages/-/plugin-content-pages-2.3.1.tgz#f534a37862be5b3f2ba5b150458d7527646b6f39" + integrity sha512-E80UL6hvKm5VVw8Ka8YaVDtO6kWWDVUK4fffGvkpQ/AJQDOg99LwOXKujPoICC22nUFTsZ2Hp70XvpezCsFQaA== + dependencies: + "@docusaurus/core" "2.3.1" + "@docusaurus/mdx-loader" "2.3.1" + "@docusaurus/types" "2.3.1" + "@docusaurus/utils" "2.3.1" + "@docusaurus/utils-validation" "2.3.1" fs-extra "^10.1.0" tslib "^2.4.0" webpack "^5.73.0" -"@docusaurus/plugin-debug@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/plugin-debug/-/plugin-debug-2.2.0.tgz#b38741d2c492f405fee01ee0ef2e0029cedb689a" - integrity sha512-p9vOep8+7OVl6r/NREEYxf4HMAjV8JMYJ7Bos5fCFO0Wyi9AZEo0sCTliRd7R8+dlJXZEgcngSdxAUo/Q+CJow== +"@docusaurus/plugin-debug@2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/plugin-debug/-/plugin-debug-2.3.1.tgz#26fef904713e148f6dee44957506280f8b7853bb" + integrity sha512-Ujpml1Ppg4geB/2hyu2diWnO49az9U2bxM9Shen7b6qVcyFisNJTkVG2ocvLC7wM1efTJcUhBO6zAku2vKJGMw== dependencies: - "@docusaurus/core" "2.2.0" - "@docusaurus/types" "2.2.0" - "@docusaurus/utils" "2.2.0" + "@docusaurus/core" "2.3.1" + "@docusaurus/types" "2.3.1" + "@docusaurus/utils" "2.3.1" fs-extra "^10.1.0" react-json-view "^1.21.3" tslib "^2.4.0" -"@docusaurus/plugin-google-analytics@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-analytics/-/plugin-google-analytics-2.2.0.tgz#63c7137eff5a1208d2059fea04b5207c037d7954" - integrity sha512-+eZVVxVeEnV5nVQJdey9ZsfyEVMls6VyWTIj8SmX0k5EbqGvnIfET+J2pYEuKQnDIHxy+syRMoRM6AHXdHYGIg== +"@docusaurus/plugin-google-analytics@2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-analytics/-/plugin-google-analytics-2.3.1.tgz#e2e7db4cf6a7063e8ba5e128d4e413f4d6a0c862" + integrity 
sha512-OHip0GQxKOFU8n7gkt3TM4HOYTXPCFDjqKbMClDD3KaDnyTuMp/Zvd9HSr770lLEscgPWIvzhJByRAClqsUWiQ== dependencies: - "@docusaurus/core" "2.2.0" - "@docusaurus/types" "2.2.0" - "@docusaurus/utils-validation" "2.2.0" + "@docusaurus/core" "2.3.1" + "@docusaurus/types" "2.3.1" + "@docusaurus/utils-validation" "2.3.1" tslib "^2.4.0" -"@docusaurus/plugin-google-gtag@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-gtag/-/plugin-google-gtag-2.2.0.tgz#7b086d169ac5fe9a88aca10ab0fd2bf00c6c6b12" - integrity sha512-6SOgczP/dYdkqUMGTRqgxAS1eTp6MnJDAQMy8VCF1QKbWZmlkx4agHDexihqmYyCujTYHqDAhm1hV26EET54NQ== +"@docusaurus/plugin-google-gtag@2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-gtag/-/plugin-google-gtag-2.3.1.tgz#b8da54a60c0a50aca609c3643faef78cb4f247a0" + integrity sha512-uXtDhfu4+Hm+oqWUySr3DNI5cWC/rmP6XJyAk83Heor3dFjZqDwCbkX8yWPywkRiWev3Dk/rVF8lEn0vIGVocA== dependencies: - "@docusaurus/core" "2.2.0" - "@docusaurus/types" "2.2.0" - "@docusaurus/utils-validation" "2.2.0" + "@docusaurus/core" "2.3.1" + "@docusaurus/types" "2.3.1" + "@docusaurus/utils-validation" "2.3.1" tslib "^2.4.0" -"@docusaurus/plugin-sitemap@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/plugin-sitemap/-/plugin-sitemap-2.2.0.tgz#876da60937886032d63143253d420db6a4b34773" - integrity sha512-0jAmyRDN/aI265CbWZNZuQpFqiZuo+5otk2MylU9iVrz/4J7gSc+ZJ9cy4EHrEsW7PV8s1w18hIEsmcA1YgkKg== - dependencies: - "@docusaurus/core" "2.2.0" - "@docusaurus/logger" "2.2.0" - "@docusaurus/types" "2.2.0" - "@docusaurus/utils" "2.2.0" - "@docusaurus/utils-common" "2.2.0" - "@docusaurus/utils-validation" "2.2.0" +"@docusaurus/plugin-google-tag-manager@2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-tag-manager/-/plugin-google-tag-manager-2.3.1.tgz#f19bc01cc784fa4734187c5bc637f0574857e15d" + integrity sha512-Ww2BPEYSqg8q8tJdLYPFFM3FMDBCVhEM4UUqKzJaiRMx3NEoly3qqDRAoRDGdIhlC//Rf0iJV9cWAoq2m6k3sw== + dependencies: + "@docusaurus/core" "2.3.1" + "@docusaurus/types" "2.3.1" + "@docusaurus/utils-validation" "2.3.1" + tslib "^2.4.0" + +"@docusaurus/plugin-sitemap@2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/plugin-sitemap/-/plugin-sitemap-2.3.1.tgz#f526ab517ca63b7a3460d585876f5952cb908aa0" + integrity sha512-8Yxile/v6QGYV9vgFiYL+8d2N4z4Er3pSHsrD08c5XI8bUXxTppMwjarDUTH/TRTfgAWotRbhJ6WZLyajLpozA== + dependencies: + "@docusaurus/core" "2.3.1" + "@docusaurus/logger" "2.3.1" + "@docusaurus/types" "2.3.1" + "@docusaurus/utils" "2.3.1" + "@docusaurus/utils-common" "2.3.1" + "@docusaurus/utils-validation" "2.3.1" fs-extra "^10.1.0" sitemap "^7.1.1" tslib "^2.4.0" -"@docusaurus/preset-classic@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/preset-classic/-/preset-classic-2.2.0.tgz#bece5a043eeb74430f7c6c7510000b9c43669eb7" - integrity sha512-yKIWPGNx7BT8v2wjFIWvYrS+nvN04W+UameSFf8lEiJk6pss0kL6SG2MRvyULiI3BDxH+tj6qe02ncpSPGwumg== - dependencies: - "@docusaurus/core" "2.2.0" - "@docusaurus/plugin-content-blog" "2.2.0" - "@docusaurus/plugin-content-docs" "2.2.0" - "@docusaurus/plugin-content-pages" "2.2.0" - "@docusaurus/plugin-debug" "2.2.0" - "@docusaurus/plugin-google-analytics" "2.2.0" - "@docusaurus/plugin-google-gtag" "2.2.0" - "@docusaurus/plugin-sitemap" "2.2.0" - "@docusaurus/theme-classic" "2.2.0" - "@docusaurus/theme-common" "2.2.0" - "@docusaurus/theme-search-algolia" "2.2.0" - "@docusaurus/types" "2.2.0" 
+"@docusaurus/preset-classic@^2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/preset-classic/-/preset-classic-2.3.1.tgz#f0193f06093eb55cafef66bd1ad9e0d33198bf95" + integrity sha512-OQ5W0AHyfdUk0IldwJ3BlnZ1EqoJuu2L2BMhqLbqwNWdkmzmSUvlFLH1Pe7CZSQgB2YUUC/DnmjbPKk/qQD0lQ== + dependencies: + "@docusaurus/core" "2.3.1" + "@docusaurus/plugin-content-blog" "2.3.1" + "@docusaurus/plugin-content-docs" "2.3.1" + "@docusaurus/plugin-content-pages" "2.3.1" + "@docusaurus/plugin-debug" "2.3.1" + "@docusaurus/plugin-google-analytics" "2.3.1" + "@docusaurus/plugin-google-gtag" "2.3.1" + "@docusaurus/plugin-google-tag-manager" "2.3.1" + "@docusaurus/plugin-sitemap" "2.3.1" + "@docusaurus/theme-classic" "2.3.1" + "@docusaurus/theme-common" "2.3.1" + "@docusaurus/theme-search-algolia" "2.3.1" + "@docusaurus/types" "2.3.1" "@docusaurus/react-loadable@5.5.2", "react-loadable@npm:@docusaurus/react-loadable@5.5.2": version "5.5.2" @@ -1458,23 +1469,23 @@ "@types/react" "*" prop-types "^15.6.2" -"@docusaurus/theme-classic@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/theme-classic/-/theme-classic-2.2.0.tgz#a048bb1bc077dee74b28bec25f4b84b481863742" - integrity sha512-kjbg/qJPwZ6H1CU/i9d4l/LcFgnuzeiGgMQlt6yPqKo0SOJIBMPuz7Rnu3r/WWbZFPi//o8acclacOzmXdUUEg== - dependencies: - "@docusaurus/core" "2.2.0" - "@docusaurus/mdx-loader" "2.2.0" - "@docusaurus/module-type-aliases" "2.2.0" - "@docusaurus/plugin-content-blog" "2.2.0" - "@docusaurus/plugin-content-docs" "2.2.0" - "@docusaurus/plugin-content-pages" "2.2.0" - "@docusaurus/theme-common" "2.2.0" - "@docusaurus/theme-translations" "2.2.0" - "@docusaurus/types" "2.2.0" - "@docusaurus/utils" "2.2.0" - "@docusaurus/utils-common" "2.2.0" - "@docusaurus/utils-validation" "2.2.0" +"@docusaurus/theme-classic@2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/theme-classic/-/theme-classic-2.3.1.tgz#8e6e194236e702c0d4e8d7b7cbb6886ae456e598" + integrity sha512-SelSIDvyttb7ZYHj8vEUhqykhAqfOPKk+uP0z85jH72IMC58e7O8DIlcAeBv+CWsLbNIl9/Hcg71X0jazuxJug== + dependencies: + "@docusaurus/core" "2.3.1" + "@docusaurus/mdx-loader" "2.3.1" + "@docusaurus/module-type-aliases" "2.3.1" + "@docusaurus/plugin-content-blog" "2.3.1" + "@docusaurus/plugin-content-docs" "2.3.1" + "@docusaurus/plugin-content-pages" "2.3.1" + "@docusaurus/theme-common" "2.3.1" + "@docusaurus/theme-translations" "2.3.1" + "@docusaurus/types" "2.3.1" + "@docusaurus/utils" "2.3.1" + "@docusaurus/utils-common" "2.3.1" + "@docusaurus/utils-validation" "2.3.1" "@mdx-js/react" "^1.6.22" clsx "^1.2.1" copy-text-to-clipboard "^3.0.1" @@ -1489,17 +1500,17 @@ tslib "^2.4.0" utility-types "^3.10.0" -"@docusaurus/theme-common@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/theme-common/-/theme-common-2.2.0.tgz#2303498d80448aafdd588b597ce9d6f4cfa930e4" - integrity sha512-R8BnDjYoN90DCL75gP7qYQfSjyitXuP9TdzgsKDmSFPNyrdE3twtPNa2dIN+h+p/pr+PagfxwWbd6dn722A1Dw== - dependencies: - "@docusaurus/mdx-loader" "2.2.0" - "@docusaurus/module-type-aliases" "2.2.0" - "@docusaurus/plugin-content-blog" "2.2.0" - "@docusaurus/plugin-content-docs" "2.2.0" - "@docusaurus/plugin-content-pages" "2.2.0" - "@docusaurus/utils" "2.2.0" +"@docusaurus/theme-common@2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/theme-common/-/theme-common-2.3.1.tgz#82f52d80226efef8c4418c4eacfc5051aa215f7f" + integrity sha512-RYmYl2OR2biO+yhmW1aS5FyEvnrItPINa+0U2dMxcHpah8reSCjQ9eJGRmAgkZFchV1+aIQzXOI1K7LCW38O0g== + 
dependencies: + "@docusaurus/mdx-loader" "2.3.1" + "@docusaurus/module-type-aliases" "2.3.1" + "@docusaurus/plugin-content-blog" "2.3.1" + "@docusaurus/plugin-content-docs" "2.3.1" + "@docusaurus/plugin-content-pages" "2.3.1" + "@docusaurus/utils" "2.3.1" "@types/history" "^4.7.11" "@types/react" "*" "@types/react-router-config" "*" @@ -1507,42 +1518,43 @@ parse-numeric-range "^1.3.0" prism-react-renderer "^1.3.5" tslib "^2.4.0" + use-sync-external-store "^1.2.0" utility-types "^3.10.0" -"@docusaurus/theme-search-algolia@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/theme-search-algolia/-/theme-search-algolia-2.2.0.tgz#77fd9f7a600917e6024fe3ac7fb6cfdf2ce84737" - integrity sha512-2h38B0tqlxgR2FZ9LpAkGrpDWVdXZ7vltfmTdX+4RsDs3A7khiNsmZB+x/x6sA4+G2V2CvrsPMlsYBy5X+cY1w== +"@docusaurus/theme-search-algolia@2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/theme-search-algolia/-/theme-search-algolia-2.3.1.tgz#d587b40913119e9287d14670e277b933d8f453f0" + integrity sha512-JdHaRqRuH1X++g5fEMLnq7OtULSGQdrs9AbhcWRQ428ZB8/HOiaN6mj3hzHvcD3DFgu7koIVtWPQnvnN7iwzHA== dependencies: "@docsearch/react" "^3.1.1" - "@docusaurus/core" "2.2.0" - "@docusaurus/logger" "2.2.0" - "@docusaurus/plugin-content-docs" "2.2.0" - "@docusaurus/theme-common" "2.2.0" - "@docusaurus/theme-translations" "2.2.0" - "@docusaurus/utils" "2.2.0" - "@docusaurus/utils-validation" "2.2.0" + "@docusaurus/core" "2.3.1" + "@docusaurus/logger" "2.3.1" + "@docusaurus/plugin-content-docs" "2.3.1" + "@docusaurus/theme-common" "2.3.1" + "@docusaurus/theme-translations" "2.3.1" + "@docusaurus/utils" "2.3.1" + "@docusaurus/utils-validation" "2.3.1" algoliasearch "^4.13.1" algoliasearch-helper "^3.10.0" clsx "^1.2.1" - eta "^1.12.3" + eta "^2.0.0" fs-extra "^10.1.0" lodash "^4.17.21" tslib "^2.4.0" utility-types "^3.10.0" -"@docusaurus/theme-translations@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/theme-translations/-/theme-translations-2.2.0.tgz#5fbd4693679806f80c26eeae1381e1f2c23d83e7" - integrity sha512-3T140AG11OjJrtKlY4pMZ5BzbGRDjNs2co5hJ6uYJG1bVWlhcaFGqkaZ5lCgKflaNHD7UHBHU9Ec5f69jTdd6w== +"@docusaurus/theme-translations@2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/theme-translations/-/theme-translations-2.3.1.tgz#b2b1ecc00a737881b5bfabc19f90b20f0fe02bb3" + integrity sha512-BsBZzAewJabVhoGG1Ij2u4pMS3MPW6gZ6sS4pc+Y7czevRpzxoFNJXRtQDVGe7mOpv/MmRmqg4owDK+lcOTCVQ== dependencies: fs-extra "^10.1.0" tslib "^2.4.0" -"@docusaurus/types@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/types/-/types-2.2.0.tgz#02c577a4041ab7d058a3c214ccb13647e21a9857" - integrity sha512-b6xxyoexfbRNRI8gjblzVOnLr4peCJhGbYGPpJ3LFqpi5nsFfoK4mmDLvWdeah0B7gmJeXabN7nQkFoqeSdmOw== +"@docusaurus/types@2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/types/-/types-2.3.1.tgz#785ade2e0f4e35e1eb7fb0d04c27d11c3991a2e8" + integrity sha512-PREbIRhTaNNY042qmfSE372Jb7djZt+oVTZkoqHJ8eff8vOIc2zqqDqBVc5BhOfpZGPTrE078yy/torUEZy08A== dependencies: "@types/history" "^4.7.11" "@types/react" "*" @@ -1553,31 +1565,32 @@ webpack "^5.73.0" webpack-merge "^5.8.0" -"@docusaurus/utils-common@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/utils-common/-/utils-common-2.2.0.tgz#a401c1b93a8697dd566baf6ac64f0fdff1641a78" - integrity sha512-qebnerHp+cyovdUseDQyYFvMW1n1nv61zGe5JJfoNQUnjKuApch3IVsz+/lZ9a38pId8kqehC1Ao2bW/s0ntDA== +"@docusaurus/utils-common@2.3.1": + version "2.3.1" + 
resolved "https://registry.yarnpkg.com/@docusaurus/utils-common/-/utils-common-2.3.1.tgz#1abe66846eb641547e4964d44f3011938e58e50b" + integrity sha512-pVlRpXkdNcxmKNxAaB1ya2hfCEvVsLDp2joeM6K6uv55Oc5nVIqgyYSgSNKZyMdw66NnvMfsu0RBylcwZQKo9A== dependencies: tslib "^2.4.0" -"@docusaurus/utils-validation@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/utils-validation/-/utils-validation-2.2.0.tgz#04d4d103137ad0145883971d3aa497f4a1315f25" - integrity sha512-I1hcsG3yoCkasOL5qQAYAfnmVoLei7apugT6m4crQjmDGxq+UkiRrq55UqmDDyZlac/6ax/JC0p+usZ6W4nVyg== +"@docusaurus/utils-validation@2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/utils-validation/-/utils-validation-2.3.1.tgz#b65c718ba9b84b7a891bccf5ac6d19b57ee7d887" + integrity sha512-7n0208IG3k1HVTByMHlZoIDjjOFC8sbViHVXJx0r3Q+3Ezrx+VQ1RZ/zjNn6lT+QBCRCXlnlaoJ8ug4HIVgQ3w== dependencies: - "@docusaurus/logger" "2.2.0" - "@docusaurus/utils" "2.2.0" + "@docusaurus/logger" "2.3.1" + "@docusaurus/utils" "2.3.1" joi "^17.6.0" js-yaml "^4.1.0" tslib "^2.4.0" -"@docusaurus/utils@2.2.0": - version "2.2.0" - resolved "https://registry.yarnpkg.com/@docusaurus/utils/-/utils-2.2.0.tgz#3d6f9b7a69168d5c92d371bf21c556a4f50d1da6" - integrity sha512-oNk3cjvx7Tt1Lgh/aeZAmFpGV2pDr5nHKrBVx6hTkzGhrnMuQqLt6UPlQjdYQ3QHXwyF/ZtZMO1D5Pfi0lu7SA== +"@docusaurus/utils@2.3.1": + version "2.3.1" + resolved "https://registry.yarnpkg.com/@docusaurus/utils/-/utils-2.3.1.tgz#24b9cae3a23b1e6dc88f95c45722c7e82727b032" + integrity sha512-9WcQROCV0MmrpOQDXDGhtGMd52DHpSFbKLfkyaYumzbTstrbA5pPOtiGtxK1nqUHkiIv8UwexS54p0Vod2I1lg== dependencies: - "@docusaurus/logger" "2.2.0" + "@docusaurus/logger" "2.3.1" "@svgr/webpack" "^6.2.1" + escape-string-regexp "^4.0.0" file-loader "^6.2.0" fs-extra "^10.1.0" github-slugger "^1.4.0" @@ -3627,10 +3640,10 @@ esutils@^2.0.2: resolved "https://registry.yarnpkg.com/esutils/-/esutils-2.0.3.tgz#74d2eb4de0b8da1293711910d50775b9b710ef64" integrity sha512-kVscqXk4OCp68SZ0dkgEKVi6/8ij300KBWTJq32P/dYeWTSwK41WyTxalN1eRmA5Z9UU/LX9D7FWSmV9SAYx6g== -eta@^1.12.3: - version "1.12.3" - resolved "https://registry.yarnpkg.com/eta/-/eta-1.12.3.tgz#2982d08adfbef39f9fa50e2fbd42d7337e7338b1" - integrity sha512-qHixwbDLtekO/d51Yr4glcaUJCIjGVJyTzuqV4GPlgZo1YpgOKG+avQynErZIYrfM6JIJdtiG2Kox8tbb+DoGg== +eta@^2.0.0: + version "2.0.1" + resolved "https://registry.yarnpkg.com/eta/-/eta-2.0.1.tgz#199e675359cb6e19d38f29e1f405e1ba0e79a6df" + integrity sha512-46E2qDPDm7QA+usjffUWz9KfXsxVZclPOuKsXs4ZWZdI/X1wpDF7AO424pt7fdYohCzWsIkXAhNGXSlwo5naAg== etag@~1.8.1: version "1.8.1" @@ -7199,6 +7212,11 @@ use-latest@^1.2.1: dependencies: use-isomorphic-layout-effect "^1.1.1" +use-sync-external-store@^1.2.0: + version "1.2.0" + resolved "https://registry.yarnpkg.com/use-sync-external-store/-/use-sync-external-store-1.2.0.tgz#7dbefd6ef3fe4e767a0cf5d7287aacfb5846928a" + integrity sha512-eEgnFxGQ1Ife9bzYs6VLi8/4X6CObHMw9Qr9tPY43iKwsPw8xE8+EFsf/2cFZ5S3esXgpWgtSCtLNS41F+sKPA== + util-deprecate@^1.0.1, util-deprecate@^1.0.2, util-deprecate@~1.0.1: version "1.0.2" resolved "https://registry.yarnpkg.com/util-deprecate/-/util-deprecate-1.0.2.tgz#450d4dc9fa70de732762fbd2d4a28981419a0ccf" From 4954fa6aa91fcbfb846b44d0473ceeb75a412ecb Mon Sep 17 00:00:00 2001 From: Kevin Carroll Date: Sun, 19 Mar 2023 16:12:30 -0700 Subject: [PATCH 2/8] WIP, file upload updates --- docs/serverExtensions/file-uploads.md | 70 ++++++++++++++++++++++----- 1 file changed, 58 insertions(+), 12 deletions(-) diff --git a/docs/serverExtensions/file-uploads.md 
b/docs/serverExtensions/file-uploads.md index 7ef00d4..78794e4 100644 --- a/docs/serverExtensions/file-uploads.md +++ b/docs/serverExtensions/file-uploads.md @@ -41,6 +41,7 @@ using GraphQL.AspNet.ServerExtensions.MultipartRequests; public class FileUploadController : GraphController { [MutationRoot("singleFileUpload")] + // highlight-next-line public async Task UploadFile(FileUpload fileRef) { var stream = await fileRef.OpenFileAsync(); @@ -51,10 +52,11 @@ public class FileUploadController : GraphController } ``` -The scalar is presented to a query as the type `Upload` as defined in the specification. +The scalar is presented to a query as the type `Upload` as defined in the specification. Be sure to declare your variables as `Upload` type to indicate an uploaded file. ```bash title="Sample Query" curl localhost:3000/graphql \ + # highlight-next-line -F operations='{ "query": "mutation ($file: Upload) { singleFileUpload(file: $file) }", "variables": { "file": null } }' \ -F map='{ "0": ["variables", "file"] }' \ -F 0=@a.txt @@ -71,6 +73,7 @@ using GraphQL.AspNet.ServerExtensions.MultipartRequests; public class FileUploadController : GraphController { [MutationRoot("multiFileUpload")] + // highlight-next-line public async Task UploadFile(IList files) { foreach(var file in files) @@ -94,12 +97,30 @@ curl localhost:3000/graphql \ ``` :::info -The server extension will only replace existing `null` values in an array declared on a variable collection. It will NOT attempt to arbitrarily add files to an empty array. +The server extension will only replace existing `null` values in an array declared on a variable collection. It will NOT attempt to arbitrarily add files to an empty or null array. Notice in the above curl command that the `files` variable is declared as an array with two elements and that the `map` field points to each of those indexed elements. ::: +### Handling an Unknown Number of Files +There are scenarios where you may ask your users to select a few files to upload without knowing how many they might choose. Some tools may be able to smartly determine the number of files and ensure the target array is declared correctly. However, you can always declare an array with more elements than you need. Those not supplied will be set to null. + +```bash title="Sample Query" + +# Only two files are supplied, but we've declared room for 6. File indexes 2-6 will be null +# when the list appears in your controller +curl localhost:3000/graphql \ + -F operations='{ + "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", + # highlight-next-line + "variables": { "files": [null, null, null, null, null, null] } }' \ + -F map='{ "firstFile": ["variables", "files", 0], "secondFile": ["variables", "files", 1] }' \ + -F firstFile=@a.txt + -F secondFile=@b.txt +``` + + ## File Uploads on Batched Queries File uploads work in conjunction with batched queries. When processing a multi-part request as a batch, prefix each of the mapped object-path references with an index of the batch you want the file to apply to. As you might guess this is usually handled by a supported client automatically. @@ -126,23 +147,44 @@ The following properties on the `FileUpload` scalar can be useful: ## Opening a File Stream When opening a file stream you need to call await a call `FileUpload.OpenFileAsync()`. This method is an abstraction on top of an internal wrapper that standardizes file streams across all implementions (see below for implementing your own file processor). 
When working with the standard `IFormFile` interface provided by ASP.NET this call is a simple wrapper for `IFormFile.OpenReadStream()`. +```csharp title=ExampleFile Upload Controller +using GraphQL.AspNet.ServerExtensions.MultipartRequests; + +public class FileUploadController : GraphController +{ + [MutationRoot("singleFileUpload")] + public async Task UploadFile(FileUpload fileRef) + { + // do something with the file stream + // it is your responsibility to close it + // highlight-next-line + var stream = await fileRef.OpenFileAsync(); + + return 0; + } +} +``` + ## Custom File Handling -By default, this extension segments the request on an `HttpContext` and presents the different parts to the query engine in a manner it expects. This means that any uploaded files are delt under the hood with as `IFormFile` references. While this is likely fine for most users, it can be troublesome with regard to timeouts and large file requests. Also, there may be scenarios where you want to save off files prior to executing a query. Perhaps you'll need to process the file stream multiple times? +By default, this extension just splits the request on an `HttpContext` and presents the different parts to the query engine at different times in a manner it expects. This means that any uploaded files are delt under the hood with as `IFormFile` references. While this is likely fine for most users, it can be troublesome with regard to timeouts and large file requests. Also, there may be scenarios where you want to save off files prior to executing a query. Perhaps you'll need to process the file stream multiple times? There are a number of niche cases where working through the raw `IFormFile` is not sufficient. -Implement and register your own `IFileUploadScalarValueMaker` to add custom processing logic for each file BEFORE graphql gets ahold of it. For instance, some users may want to write incoming files to local disk or cloud storage and present GraphQL with a stream that points to that local reference, rather than the file reference on the raw request. +You can implement and register your own `IFileUploadScalarValueMaker` to add custom processing logic for each file or blob BEFORE graphql gets ahold of it. For instance, some users may want to write incoming files to local disk or cloud storage and present GraphQL with a stream that points to that local reference, rather than the file reference on the raw request. ```csharp public interface IFileUploadScalarValueMaker - { - Task CreateFileScalar(IFormFile aspNetFile); +{ + // This overload is used when processing traditional files received as part of a + // multi-part form through ASP.NET's HttpContext + Task CreateFileScalar(IFormFile aspNetFile); - // This overload is used when processing data received on a - // multi-part form field rather than as a formal file upload. - Task CreateFileScalar(string mapKey, byte[] blobData); - } + // This overload is used when processing data received on a + // multi-part form field rather than as a formal file upload. + Task CreateFileScalar(string mapKey, byte[] blobData); +} ``` -```csharp title=Register Your Custom Value Maker + +```csharp title="Register Your Custom Value Maker" // startup code // register your value maker BEFORE calling .AddGraphQL @@ -152,6 +194,10 @@ services.AddGraphQL(options => { options.RegisterExtension(); }); ``` + :::tip -Take a look at the [`default upload scalar value maker`]("http://google.com") for some helpful details when trying to implement your own. 
+You can inherit from `FileUpload` any extend it as needed. ::: + +Take a look at the [default upload scalar value maker]("http://google.com") for some helpful details when trying to implement your own. + From e443c09d33ec498863bf1c233d0c1f4fc67b6e17 Mon Sep 17 00:00:00 2001 From: Kevin Carroll Date: Mon, 3 Apr 2023 18:36:30 -0700 Subject: [PATCH 3/8] WIP --- docs/serverExtensions/batch-processing.md | 170 +++++++++++++++++++++- docs/serverExtensions/file-uploads.md | 44 +++--- 2 files changed, 189 insertions(+), 25 deletions(-) diff --git a/docs/serverExtensions/batch-processing.md b/docs/serverExtensions/batch-processing.md index 1803e47..9d6250e 100644 --- a/docs/serverExtensions/batch-processing.md +++ b/docs/serverExtensions/batch-processing.md @@ -4,9 +4,14 @@ title: Batch Query Processing sidebar_label: Batch Processing sidebar_position: 1 --- +.NET 6+ ## GraphQL Multipart Request Specification -GraphQL ASP.NET provides built in support for batch query processing via an implementation of the `GraphQL Multipart Request Specification`. You can read the specification [here](https://github.com/jaydenseric/graphql-multipart-request-spec) if you are interested in the details. +GraphQL ASP.NET provides built in support for batch query processing via an implementation of the [GraphQL Multipart Request Specification](https://github.com/jaydenseric/graphql-multipart-request-spec). + +:::caution +This document covers how to submit a batch query that conforms to the above specification. It provides sample curl requests that would be accepted for the given sample code but does not explain in detail the various form fields required to complete a request. It is highly recommended to use a [supported client](https://github.com/jaydenseric/graphql-multipart-request-spec#client) when enabling this server extension. +::: ## Enable Batch Query Support @@ -21,6 +26,167 @@ services.AddGraphQL(options => { ``` -:::note +:::tip Batch query processing and [file uploads](./file-uploads.md) are implemented as part of the same specification and therefore are encapsulated in the same extension. +::: + +## Processing a Single Query +Provide an "operations" form field that represents a single query and the engine will automatically detect and return a normal graphql response. + +```bash title="Example Batch Query" +curl localhost:3000/graphql \ + #highlight-next-line + -F operations='{ "query": "query { findUser(lastName: \"Smith\") {firstName lastName} }" }' \ +``` + +```json title="Example Json Serialized Response" +{ + "data": { + "findUser": { + "firstName": "Baily", + "lastName": "Smith" + } + } +} +``` + +:::info +The extension is backwards compatible with standard graphql http request processing. If a request is recieved that is not a multi-part form POST request, normal graphql processing will occur. +::: + +## Processing a Batch of Queries +Provide an "operations" form field that represents an array of graphql requests, the engine will automatically detect the array and return an array of responses in the same order as they were received. Each query is processed asyncronously and independently. 
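For example, the following request submits two `findUser` queries in a single POST and receives an array containing both results: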
+ +```bash title="Example Batch Query" +curl localhost:3000/graphql \ + #highlight-start + -F operations='[ + { "query": "query { findUser(lastName: \"Smith\") {firstName lastName} }" }, + { "query": "query { findUser(lastName: \"Jones\") {firstName lastName} }" }, + ]' \ + # highlight-end +``` + +```json title="Example Json Serialized Response" +[ + { + "data": { + "findUser": { + "firstName": "Baily", + "lastName": "Smith" + } + } + }, + { + "data": { + "findUser": { + "firstName": "Caleb", + "lastName": "Jones" + } + } + } +] +``` +## Batch Order is Never Guaranteed +While the order of the results is guaranteed to be sorted in the same order in which the queries were received, there is no guarantee that the queries are executed in any specific order. This means if you submit a batch of 5 requests, each requests may complete in a randomized order. If the same batch is submitted 3 times, its possible that the execution order will be different each time. + +For queries this is usally not an issue, but if you are batching mutations, make sure you don't have any unexpected dependencies between queries. If your controllers perform business logic against an existing object and that object is modified by more than of your mutations its highly possible that the state of the object may be unexpectedly modified in some executions but not in others. + +Take this controller and query: +```csharp title=Example Controller + +public class FileUploadController : GraphController +{ + [MutationRoot("addMoney")] + // highlight-next-line + public async Task AddMoney(int itemId, int dollarsToAdd) + { + var item =await _service.RetrieveItem(itemId); + item.CurrentTotal += dollarsToAdd; + + await _service.UpdateItem(item); + return item; + } + + +} +``` + +```bash title="Example Batch Query" +curl localhost:3000/graphql \ + #highlight-start + -F operations='[ + { "query": "mutation { addMoney(itemId: 34, dollarsToAdd: 5) {id currentTotal} }" }, + { "query": "mutation { addThreeDollars(itemId: 34, , dollarsToAdd: 3) {id currentTotal} }" }, + ]' \ + # highlight-end +``` + +Assuming that the initial value of `currentTotal` was 0, all three of these responses are equally likely to occur depending on the order in which the execution engine decides to process the queries. +```json title=Sample Json Results +// When the queries are executed in declared order +[ + { + "data": { + "addMoney": { + "id": 34, + "currentTotal": 5 + } + } + }, + { + "data": { + "addMoney": { + "id": 34, + "currentTotal": 8 + } + } + }, +] + +// When the queries are executed in reverse order +[ + { + "data": { + "addMoney": { + "id": 34, + "currentTotal": 8 + } + } + }, + { + "data": { + "addMoney": { + "id": 34, + "currentTotal": 3 + } + } + }, +] + +// When the queries are executed simultaniously +// The final result updated to the datastore is unknown +[ + { + "data": { + "addMoney": { + "id": 34, + "currentTotal": 5 + } + } + }, + { + "data": { + "addMoney": { + "id": 34, + "currentTotal": 3 + } + } + }, +] +``` + +Under the hood, the batch process will parse and submit all queries to the engine simultaniously and wait for them to finish before structuring a result object. +:::caution +Ensure there are no dependencies between queries in a batch. An expected order of execution is never guaranteed. 
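If a strict execution order is required, consider submitting the queries as separate, sequential requests rather than as a single batch.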
::: \ No newline at end of file diff --git a/docs/serverExtensions/file-uploads.md b/docs/serverExtensions/file-uploads.md index 78794e4..5ba96e4 100644 --- a/docs/serverExtensions/file-uploads.md +++ b/docs/serverExtensions/file-uploads.md @@ -4,10 +4,10 @@ title: File Uploads sidebar_label: File Uploads sidebar_position: 0 --- +.NET 6+ ## GraphQL Multipart Request Specification -GraphQL ASP.NET provides built in support for file uploads via an implementation of the `GraphQL Multipart Request Specification`. You can read the -specification [here](https://github.com/jaydenseric/graphql-multipart-request-spec) if you are interested in the details. +GraphQL ASP.NET provides built in support for file uploads via an implementation of the [GraphQL Multipart Request Specification](https://github.com/jaydenseric/graphql-multipart-request-spec). :::caution This document covers how to setup a controller to accept files from an http request that conforms to the above specification. It provides sample curl requests that would be accepted for the given sample code but does not explain in detail the various form fields required to complete a request. It is highly recommended to use a [supported client](https://github.com/jaydenseric/graphql-multipart-request-spec#client) when enabling this server extension. @@ -32,12 +32,10 @@ File uploads and [batch query processing](./batch-processing.md) are implemented ## A Basic Controller -Files are received as a special scalar type `GraphQL.AspNet.ServerExtensions.MultipartRequests.FileUpload`. Add a reference in your controller to this +Files are received as a special scalar type `FileUpload`. Add a reference in your controller to this scalar like you would any other scalar. ```csharp title=ExampleFile Upload Controller -using GraphQL.AspNet.ServerExtensions.MultipartRequests; - public class FileUploadController : GraphController { [MutationRoot("singleFileUpload")] @@ -54,7 +52,13 @@ public class FileUploadController : GraphController The scalar is presented to a query as the type `Upload` as defined in the specification. Be sure to declare your variables as `Upload` type to indicate an uploaded file. -```bash title="Sample Query" +```graphql title="Use the Upload graph type for variables" +mutation ($file: Upload) { + singleFileUpload(file: $file) +} +``` + +```bash title="Sample curl Query" curl localhost:3000/graphql \ # highlight-next-line -F operations='{ "query": "mutation ($file: Upload) { singleFileUpload(file: $file) }", "variables": { "file": null } }' \ @@ -96,15 +100,8 @@ curl localhost:3000/graphql \ -F secondFile=@b.txt ``` -:::info -The server extension will only replace existing `null` values in an array declared on a variable collection. It will NOT attempt to arbitrarily add files to an empty or null array. - -Notice in the above curl command that the `files` variable is declared as an array with two elements and that the `map` field -points to each of those indexed elements. -::: - ### Handling an Unknown Number of Files -There are scenarios where you may ask your users to select a few files to upload without knowing how many they might choose. Some tools may be able to smartly determine the number of files and ensure the target array is declared correctly. However, you can always declare an array with more elements than you need. Those not supplied will be set to null. +There are scenarios where you may ask your users to select a few files to upload without knowing how many they might choose. 
As long as your indicating path points to an index value, the target array will be resized accordingly. ```bash title="Sample Query" @@ -114,12 +111,12 @@ curl localhost:3000/graphql \ -F operations='{ "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", # highlight-next-line - "variables": { "files": [null, null, null, null, null, null] } }' \ + "variables": { "files": [] } }' \ -F map='{ "firstFile": ["variables", "files", 0], "secondFile": ["variables", "files", 1] }' \ -F firstFile=@a.txt -F secondFile=@b.txt ``` - +_In the above example, the `files` array will be automatically expanded to include indexes 0 and 1 as requested by the `map`._ ## File Uploads on Batched Queries File uploads work in conjunction with batched queries. When processing a multi-part request as a batch, prefix each of the mapped object-path references with an index of the batch you want the file to apply to. As you might guess this is usually handled by a supported client automatically. @@ -127,8 +124,8 @@ File uploads work in conjunction with batched queries. When processing a multi-p ```bash title="Sample Query" curl localhost:3000/graphql \ -F operations='[ - { "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", "variables": { "files": [null, null] } }, - { "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", "variables": { "files": [null, null] } }, + { "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", "variables": { "files": [] } }, + { "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", "variables": { "files": [] } }, ]' \ # highlight-next-line -F map='{ "firstFile": [0, "variables", "files", 0], "secondFile": [1, "variables", "files", 0] }' \ @@ -156,19 +153,20 @@ public class FileUploadController : GraphController public async Task UploadFile(FileUpload fileRef) { // do something with the file stream - // it is your responsibility to close it + // it is your responsibility to close and dispose of it // highlight-next-line - var stream = await fileRef.OpenFileAsync(); + var stream = await fileRef.OpenFileStreamAsync(); return 0; } } ``` + ## Custom File Handling -By default, this extension just splits the request on an `HttpContext` and presents the different parts to the query engine at different times in a manner it expects. This means that any uploaded files are delt under the hood with as `IFormFile` references. While this is likely fine for most users, it can be troublesome with regard to timeouts and large file requests. Also, there may be scenarios where you want to save off files prior to executing a query. Perhaps you'll need to process the file stream multiple times? There are a number of niche cases where working through the raw `IFormFile` is not sufficient. +By default, this extension just splits the request on an `HttpContext` and presents the different parts to the query engine in a manner it expects. This means that any uploaded files are consumed under the hood as ASP.NET's built in `IFormFile` interface. While this is likely fine for most users, it can be troublesome with regard to timeouts and large file requests. Also, there may be scenarios where you want to save off files prior to executing a query or perhaps you'll need to process the file stream multiple times? There are a number of niche cases where working through the raw `IFormFile` is not sufficient. 
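For example, one common approach for large files or multi-pass processing is to buffer each incoming file to temporary storage before the query executes. The helper below is only an illustrative sketch using standard ASP.NET Core APIs; it is not part of this library, and wiring the resulting stream into a `FileUpload` instance is left to the custom value maker described next.

```csharp title="Buffering an upload to a temp file (illustrative)"
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

public static class UploadBufferingExample
{
    // Copies the incoming form file to a temporary file on disk and returns a
    // read-only stream over that copy. The caller owns the returned stream and
    // is responsible for disposing it and deleting the temp file when finished.
    public static async Task<FileStream> CopyToTempFileAsync(IFormFile aspNetFile)
    {
        var tempPath = Path.GetTempFileName();

        using (var destination = File.Create(tempPath))
        {
            await aspNetFile.CopyToAsync(destination);
        }

        return File.OpenRead(tempPath);
    }
}
```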
-You can implement and register your own `IFileUploadScalarValueMaker` to add custom processing logic for each file or blob BEFORE graphql gets ahold of it. For instance, some users may want to write incoming files to local disk or cloud storage and present GraphQL with a stream that points to that local reference, rather than the file reference on the raw request. +You can implement and register your own `IFileUploadScalarValueMaker` to add custom processing logic for each file or blob BEFORE graphql gets ahold of it. For instance, some users may want to write incoming files to local disk or cloud storage and present GraphQL with a stream that points to that local reference, rather than the file reference on the http request. ```csharp public interface IFileUploadScalarValueMaker @@ -199,5 +197,5 @@ services.AddGraphQL(options => { You can inherit from `FileUpload` any extend it as needed. ::: -Take a look at the [default upload scalar value maker]("http://google.com") for some helpful details when trying to implement your own. +Take a look at the [default upload scalar value maker](https://google.com) for some helpful details when trying to implement your own. From 2306cea655a0a9ae2ea68b57199e97255c40f1fb Mon Sep 17 00:00:00 2001 From: Kevin Carroll Date: Mon, 3 Apr 2023 19:02:31 -0700 Subject: [PATCH 4/8] WIP, updated docusaurus and fileupload page --- docs/serverExtensions/file-uploads.md | 81 ++++-- package.json | 6 +- yarn.lock | 375 +++++++++++++------------- 3 files changed, 254 insertions(+), 208 deletions(-) diff --git a/docs/serverExtensions/file-uploads.md b/docs/serverExtensions/file-uploads.md index 5ba96e4..2603454 100644 --- a/docs/serverExtensions/file-uploads.md +++ b/docs/serverExtensions/file-uploads.md @@ -50,7 +50,7 @@ public class FileUploadController : GraphController } ``` -The scalar is presented to a query as the type `Upload` as defined in the specification. Be sure to declare your variables as `Upload` type to indicate an uploaded file. +The scalar is named `Upload` in the final graphql schema as defined in the specification. Be sure to declare your variables as an `Upload` type to indicate an uploaded file. ```graphql title="Use the Upload graph type for variables" mutation ($file: Upload) { @@ -62,7 +62,7 @@ mutation ($file: Upload) { curl localhost:3000/graphql \ # highlight-next-line -F operations='{ "query": "mutation ($file: Upload) { singleFileUpload(file: $file) }", "variables": { "file": null } }' \ - -F map='{ "0": ["variables", "file"] }' \ + -F map='{ "0": ["variables.file"] }' \ -F 0=@a.txt ``` @@ -71,14 +71,18 @@ curl localhost:3000/graphql \ Arrays of files work just like any other list in GraphQL. When declaring the map variable for the multi-part request, be sure to indicate which index you are mapping the file to. The extension will not magically append files to an array. Each mapped file must explicitly declare the element index in an array where it is being placed. -```csharp title=ExampleFile Upload Controller +Warning: Be sure to dispose of each file stream when you are finished with them +
+
+ +```csharp title="Example File Upload Controller" using GraphQL.AspNet.ServerExtensions.MultipartRequests; public class FileUploadController : GraphController { [MutationRoot("multiFileUpload")] // highlight-next-line - public async Task UploadFile(IList files) + public async Task UploadFile(IEnumerable files) { foreach(var file in files) { @@ -91,22 +95,26 @@ public class FileUploadController : GraphController } ``` -```bash title="Sample Query" +```graphql title="Declare a list of files on a query" +# highlight-next-line +mutation ($files: [Upload]) { + multiFileUpload(file: $files) +} +``` + +```bash title="Sample Curl" curl localhost:3000/graphql \ - -F operations='{ "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", "variables": { "files": [null, null] } }' \ - # highlight-next-line +# highlight-next-line + -F operations='{ "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", "variables": { "files": [null, null] } }' \ -F map='{ "firstFile": ["variables", "files", 0], "secondFile": ["variables", "files", 1] }' \ -F firstFile=@a.txt -F secondFile=@b.txt ``` ### Handling an Unknown Number of Files -There are scenarios where you may ask your users to select a few files to upload without knowing how many they might choose. As long as your indicating path points to an index value, the target array will be resized accordingly. - -```bash title="Sample Query" +There are scenarios where you may ask your users to select a few files to upload without knowing how many they might choose. As long each declaration in your `map` field points to a position that could be a valid index in an array on the `operations` object, the target array will be resized accordingly. -# Only two files are supplied, but we've declared room for 6. File indexes 2-6 will be null -# when the list appears in your controller +```bash title="Adding Two Files" curl localhost:3000/graphql \ -F operations='{ "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", @@ -116,7 +124,39 @@ curl localhost:3000/graphql \ -F firstFile=@a.txt -F secondFile=@b.txt ``` -_In the above example, the `files` array will be automatically expanded to include indexes 0 and 1 as requested by the `map`._ + + +In the above example, the `files` array will be automatically expanded to include indexes 0 and 1 as requested by the `map`: + +```json title="Resultant Operations Object" +{ + "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", + "variables": { "files": [, ] } +} +``` + +### Skipping Array Indexes +If you skip any indexes in your `map` declaration, the target array will be expanded to to include the out of sequence index. This can produce null values in your array and, depending on your query declaration, result in an error if your variable does not allow nulls. + +```bash title="Adding One File To Index 5" +# Only one file is supplied but its mapped to index 5 +# the final array at 'variables.files` will be 6 elements long with 5 null elements. 
+curl localhost:3000/graphql \ + -F operations='{ + "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", + # highlight-next-line + "variables": { "files": [] } }' \ + # highlight-next-line + -F map='{ "firstFile": ["variables", "files", 5] }' \ + -F firstFile=@a.txt +``` + +```json title="Resultant Operations Object" +{ + "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", + "variables": { "files": [null, null, null, null, null, ] } +} +``` ## File Uploads on Batched Queries File uploads work in conjunction with batched queries. When processing a multi-part request as a batch, prefix each of the mapped object-path references with an index of the batch you want the file to apply to. As you might guess this is usually handled by a supported client automatically. @@ -136,13 +176,13 @@ curl localhost:3000/graphql \ ## The FileUpload Scalar The following properties on the `FileUpload` scalar can be useful: -* `FileName` - The name of the file that was uploaded. This property will be null if a non-file form field is referenced +* `FileName` - The name of the file that was uploaded. This property will be null if a non-file form field is referenced. * `MapKey` - The key value used to place this file within a variable collection. This is usually the form field name on the multi-part request. * `ContentType` - The supplied `content-type` value sent with the file. This value will be null for non-file fields. * `Headers` - A collection of all the headers provided with the uploaded file. This value will be null for non-file fields. ## Opening a File Stream -When opening a file stream you need to call await a call `FileUpload.OpenFileAsync()`. This method is an abstraction on top of an internal wrapper that standardizes file streams across all implementions (see below for implementing your own file processor). When working with the standard `IFormFile` interface provided by ASP.NET this call is a simple wrapper for `IFormFile.OpenReadStream()`. +When opening a file stream you need to await a call `FileUpload.OpenFileAsync()`. This method is an abstraction on top of an internal wrapper that standardizes file streams across all implementions (see below for implementing your own file processor). When working with the standard `IFormFile` interface provided by ASP.NET this call is a simple wrapper for `IFormFile.OpenReadStream()`. ```csharp title=ExampleFile Upload Controller using GraphQL.AspNet.ServerExtensions.MultipartRequests; @@ -162,9 +202,8 @@ public class FileUploadController : GraphController } ``` - ## Custom File Handling -By default, this extension just splits the request on an `HttpContext` and presents the different parts to the query engine in a manner it expects. This means that any uploaded files are consumed under the hood as ASP.NET's built in `IFormFile` interface. While this is likely fine for most users, it can be troublesome with regard to timeouts and large file requests. Also, there may be scenarios where you want to save off files prior to executing a query or perhaps you'll need to process the file stream multiple times? There are a number of niche cases where working through the raw `IFormFile` is not sufficient. +By default, this extension just splits the request on an `HttpContext` and presents the different parts to the query engine in a manner it expects. This means that any uploaded files are consumed under the hood as ASP.NET's built in `IFormFile` interface. 
While this is fine for most users, it can be troublesome with regard to timeouts and large file requests. Also, there may be scenarios where you want to save off files prior to executing a query or perhaps you'll need to process the file stream multiple times? You can implement and register your own `IFileUploadScalarValueMaker` to add custom processing logic for each file or blob BEFORE graphql gets ahold of it. For instance, some users may want to write incoming files to local disk or cloud storage and present GraphQL with a stream that points to that local reference, rather than the file reference on the http request. @@ -194,8 +233,14 @@ services.AddGraphQL(options => { ``` :::tip -You can inherit from `FileUpload` any extend it as needed. +You can inherit from `FileUpload` and extend it as needed. Just be sure to declare it as `FileUpload` in your controllers so that GraphQL knows +what scalar you are requesting. ::: Take a look at the [default upload scalar value maker](https://google.com) for some helpful details when trying to implement your own. +## Timeouts and File Uploads + +Be mindful of any query timeouts you have set for your schemas. ASP.NET may start processing your query before all the file contents are made available to the server as long as it has the initial POST request. This also means that your graphql queries may start executing before the file contents arrive. + +While this asysncronicty usually works to your advantage, you may find that your queries pause on `.OpenFileStreamAsync()` waiting for the file for a long period of time if there is a network delay or a large file being uploaded. If you have a [custom timeout](../reference/schema-configuration.md#querytimeout) configured for a schema, it may trigger while waiting for the file. Be sure to set your timeouts to a long enough period of time to avoid this issue. 
\ No newline at end of file diff --git a/package.json b/package.json index 68a3150..d0cf1af 100644 --- a/package.json +++ b/package.json @@ -14,8 +14,8 @@ "write-heading-ids": "docusaurus write-heading-ids" }, "dependencies": { - "@docusaurus/core": "^2.3.1", - "@docusaurus/preset-classic": "^2.3.1", + "@docusaurus/core": "^2.4.0", + "@docusaurus/preset-classic": "^2.4.0", "@mdx-js/react": "^1.6.22", "clsx": "^1.2.1", "prism-react-renderer": "^1.3.5", @@ -23,7 +23,7 @@ "react-dom": "^17.0.2" }, "devDependencies": { - "@docusaurus/module-type-aliases": "^2.3.1" + "@docusaurus/module-type-aliases": "^2.4.0" }, "browserslist": { "production": [ diff --git a/yarn.lock b/yarn.lock index f0173c1..0c430fa 100644 --- a/yarn.lock +++ b/yarn.lock @@ -1195,10 +1195,10 @@ "@docsearch/css" "3.3.0" algoliasearch "^4.0.0" -"@docusaurus/core@2.3.1", "@docusaurus/core@^2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/core/-/core-2.3.1.tgz#32849f2ffd2f086a4e55739af8c4195c5eb386f2" - integrity sha512-0Jd4jtizqnRAr7svWaBbbrCCN8mzBNd2xFLoT/IM7bGfFie5y58oz97KzXliwiLY3zWjqMXjQcuP1a5VgCv2JA== +"@docusaurus/core@2.4.0", "@docusaurus/core@^2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/core/-/core-2.4.0.tgz#a12c175cb2e5a7e4582e65876a50813f6168913d" + integrity sha512-J55/WEoIpRcLf3afO5POHPguVZosKmJEQWKBL+K7TAnfuE7i+Y0NPLlkKtnWCehagGsgTqClfQEexH/UT4kELA== dependencies: "@babel/core" "^7.18.6" "@babel/generator" "^7.18.7" @@ -1210,13 +1210,13 @@ "@babel/runtime" "^7.18.6" "@babel/runtime-corejs3" "^7.18.6" "@babel/traverse" "^7.18.8" - "@docusaurus/cssnano-preset" "2.3.1" - "@docusaurus/logger" "2.3.1" - "@docusaurus/mdx-loader" "2.3.1" + "@docusaurus/cssnano-preset" "2.4.0" + "@docusaurus/logger" "2.4.0" + "@docusaurus/mdx-loader" "2.4.0" "@docusaurus/react-loadable" "5.5.2" - "@docusaurus/utils" "2.3.1" - "@docusaurus/utils-common" "2.3.1" - "@docusaurus/utils-validation" "2.3.1" + "@docusaurus/utils" "2.4.0" + "@docusaurus/utils-common" "2.4.0" + "@docusaurus/utils-validation" "2.4.0" "@slorber/static-site-generator-webpack-plugin" "^4.0.7" "@svgr/webpack" "^6.2.1" autoprefixer "^10.4.7" @@ -1272,33 +1272,33 @@ webpack-merge "^5.8.0" webpackbar "^5.0.2" -"@docusaurus/cssnano-preset@2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/cssnano-preset/-/cssnano-preset-2.3.1.tgz#e042487655e3e062417855e12edb3f6eee8f5ecb" - integrity sha512-7mIhAROES6CY1GmCjR4CZkUfjTL6B3u6rKHK0ChQl2d1IevYXq/k/vFgvOrJfcKxiObpMnE9+X6R2Wt1KqxC6w== +"@docusaurus/cssnano-preset@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/cssnano-preset/-/cssnano-preset-2.4.0.tgz#9213586358e0cce517f614af041eb7d184f8add6" + integrity sha512-RmdiA3IpsLgZGXRzqnmTbGv43W4OD44PCo+6Q/aYjEM2V57vKCVqNzuafE94jv0z/PjHoXUrjr69SaRymBKYYw== dependencies: cssnano-preset-advanced "^5.3.8" postcss "^8.4.14" postcss-sort-media-queries "^4.2.1" tslib "^2.4.0" -"@docusaurus/logger@2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/logger/-/logger-2.3.1.tgz#d76aefb452e3734b4e0e645efc6cbfc0aae52869" - integrity sha512-2lAV/olKKVr9qJhfHFCaqBIl8FgYjbUFwgUnX76+cULwQYss+42ZQ3grHGFvI0ocN2X55WcYe64ellQXz7suqg== +"@docusaurus/logger@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/logger/-/logger-2.4.0.tgz#393d91ad9ecdb9a8f80167dd6a34d4b45219b835" + integrity sha512-T8+qR4APN+MjcC9yL2Es+xPJ2923S9hpzDmMtdsOcUGLqpCGBbU1vp3AAqDwXtVgFkq+NsEk7sHdVsfLWR/AXw== dependencies: chalk "^4.1.2" tslib "^2.4.0" 
-"@docusaurus/mdx-loader@2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/mdx-loader/-/mdx-loader-2.3.1.tgz#7ec6acee5eff0a280e1b399ea4dd690b15a793f7" - integrity sha512-Gzga7OsxQRpt3392K9lv/bW4jGppdLFJh3luKRknCKSAaZrmVkOQv2gvCn8LAOSZ3uRg5No7AgYs/vpL8K94lA== +"@docusaurus/mdx-loader@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/mdx-loader/-/mdx-loader-2.4.0.tgz#c6310342904af2f203e7df86a9df623f86840f2d" + integrity sha512-GWoH4izZKOmFoC+gbI2/y8deH/xKLvzz/T5BsEexBye8EHQlwsA7FMrVa48N063bJBH4FUOiRRXxk5rq9cC36g== dependencies: "@babel/parser" "^7.18.8" "@babel/traverse" "^7.18.8" - "@docusaurus/logger" "2.3.1" - "@docusaurus/utils" "2.3.1" + "@docusaurus/logger" "2.4.0" + "@docusaurus/utils" "2.4.0" "@mdx-js/mdx" "^1.6.22" escape-html "^1.0.3" file-loader "^6.2.0" @@ -1313,13 +1313,13 @@ url-loader "^4.1.1" webpack "^5.73.0" -"@docusaurus/module-type-aliases@2.3.1", "@docusaurus/module-type-aliases@^2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/module-type-aliases/-/module-type-aliases-2.3.1.tgz#986186200818fed999be2e18d6c698eaf4683a33" - integrity sha512-6KkxfAVOJqIUynTRb/tphYCl+co3cP0PlHiMDbi+SzmYxMdgIrwYqH9yAnGSDoN6Jk2ZE/JY/Azs/8LPgKP48A== +"@docusaurus/module-type-aliases@2.4.0", "@docusaurus/module-type-aliases@^2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/module-type-aliases/-/module-type-aliases-2.4.0.tgz#6961605d20cd46f86163ed8c2d83d438b02b4028" + integrity sha512-YEQO2D3UXs72qCn8Cr+RlycSQXVGN9iEUyuHwTuK4/uL/HFomB2FHSU0vSDM23oLd+X/KibQ3Ez6nGjQLqXcHg== dependencies: "@docusaurus/react-loadable" "5.5.2" - "@docusaurus/types" "2.3.1" + "@docusaurus/types" "2.4.0" "@types/history" "^4.7.11" "@types/react" "*" "@types/react-router-config" "*" @@ -1327,18 +1327,18 @@ react-helmet-async "*" react-loadable "npm:@docusaurus/react-loadable@5.5.2" -"@docusaurus/plugin-content-blog@2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-blog/-/plugin-content-blog-2.3.1.tgz#236b8ee4f20f7047aa9c285ae77ae36683ad48a3" - integrity sha512-f5LjqX+9WkiLyGiQ41x/KGSJ/9bOjSD8lsVhPvYeUYHCtYpuiDKfhZE07O4EqpHkBx4NQdtQDbp+aptgHSTuiw== - dependencies: - "@docusaurus/core" "2.3.1" - "@docusaurus/logger" "2.3.1" - "@docusaurus/mdx-loader" "2.3.1" - "@docusaurus/types" "2.3.1" - "@docusaurus/utils" "2.3.1" - "@docusaurus/utils-common" "2.3.1" - "@docusaurus/utils-validation" "2.3.1" +"@docusaurus/plugin-content-blog@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-blog/-/plugin-content-blog-2.4.0.tgz#50dbfbc7b51f152ae660385fd8b34076713374c3" + integrity sha512-YwkAkVUxtxoBAIj/MCb4ohN0SCtHBs4AS75jMhPpf67qf3j+U/4n33cELq7567hwyZ6fMz2GPJcVmctzlGGThQ== + dependencies: + "@docusaurus/core" "2.4.0" + "@docusaurus/logger" "2.4.0" + "@docusaurus/mdx-loader" "2.4.0" + "@docusaurus/types" "2.4.0" + "@docusaurus/utils" "2.4.0" + "@docusaurus/utils-common" "2.4.0" + "@docusaurus/utils-validation" "2.4.0" cheerio "^1.0.0-rc.12" feed "^4.2.2" fs-extra "^10.1.0" @@ -1349,18 +1349,18 @@ utility-types "^3.10.0" webpack "^5.73.0" -"@docusaurus/plugin-content-docs@2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-docs/-/plugin-content-docs-2.3.1.tgz#feae1555479558a55182f22f8a07acc5e0d7444d" - integrity sha512-DxztTOBEruv7qFxqUtbsqXeNcHqcVEIEe+NQoI1oi2DBmKBhW/o0MIal8lt+9gvmpx3oYtlwmLOOGepxZgJGkw== - dependencies: - "@docusaurus/core" "2.3.1" - "@docusaurus/logger" "2.3.1" - 
"@docusaurus/mdx-loader" "2.3.1" - "@docusaurus/module-type-aliases" "2.3.1" - "@docusaurus/types" "2.3.1" - "@docusaurus/utils" "2.3.1" - "@docusaurus/utils-validation" "2.3.1" +"@docusaurus/plugin-content-docs@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-docs/-/plugin-content-docs-2.4.0.tgz#36e235adf902325735b873b4f535205884363728" + integrity sha512-ic/Z/ZN5Rk/RQo+Io6rUGpToOtNbtPloMR2JcGwC1xT2riMu6zzfSwmBi9tHJgdXH6CB5jG+0dOZZO8QS5tmDg== + dependencies: + "@docusaurus/core" "2.4.0" + "@docusaurus/logger" "2.4.0" + "@docusaurus/mdx-loader" "2.4.0" + "@docusaurus/module-type-aliases" "2.4.0" + "@docusaurus/types" "2.4.0" + "@docusaurus/utils" "2.4.0" + "@docusaurus/utils-validation" "2.4.0" "@types/react-router-config" "^5.0.6" combine-promises "^1.1.0" fs-extra "^10.1.0" @@ -1371,95 +1371,95 @@ utility-types "^3.10.0" webpack "^5.73.0" -"@docusaurus/plugin-content-pages@2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-pages/-/plugin-content-pages-2.3.1.tgz#f534a37862be5b3f2ba5b150458d7527646b6f39" - integrity sha512-E80UL6hvKm5VVw8Ka8YaVDtO6kWWDVUK4fffGvkpQ/AJQDOg99LwOXKujPoICC22nUFTsZ2Hp70XvpezCsFQaA== - dependencies: - "@docusaurus/core" "2.3.1" - "@docusaurus/mdx-loader" "2.3.1" - "@docusaurus/types" "2.3.1" - "@docusaurus/utils" "2.3.1" - "@docusaurus/utils-validation" "2.3.1" +"@docusaurus/plugin-content-pages@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-pages/-/plugin-content-pages-2.4.0.tgz#6169909a486e1eae0ddffff0b1717ce4332db4d4" + integrity sha512-Pk2pOeOxk8MeU3mrTU0XLIgP9NZixbdcJmJ7RUFrZp1Aj42nd0RhIT14BGvXXyqb8yTQlk4DmYGAzqOfBsFyGw== + dependencies: + "@docusaurus/core" "2.4.0" + "@docusaurus/mdx-loader" "2.4.0" + "@docusaurus/types" "2.4.0" + "@docusaurus/utils" "2.4.0" + "@docusaurus/utils-validation" "2.4.0" fs-extra "^10.1.0" tslib "^2.4.0" webpack "^5.73.0" -"@docusaurus/plugin-debug@2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/plugin-debug/-/plugin-debug-2.3.1.tgz#26fef904713e148f6dee44957506280f8b7853bb" - integrity sha512-Ujpml1Ppg4geB/2hyu2diWnO49az9U2bxM9Shen7b6qVcyFisNJTkVG2ocvLC7wM1efTJcUhBO6zAku2vKJGMw== +"@docusaurus/plugin-debug@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/plugin-debug/-/plugin-debug-2.4.0.tgz#1ad513fe9bcaf017deccf62df8b8843faeeb7d37" + integrity sha512-KC56DdYjYT7Txyux71vXHXGYZuP6yYtqwClvYpjKreWIHWus5Zt6VNi23rMZv3/QKhOCrN64zplUbdfQMvddBQ== dependencies: - "@docusaurus/core" "2.3.1" - "@docusaurus/types" "2.3.1" - "@docusaurus/utils" "2.3.1" + "@docusaurus/core" "2.4.0" + "@docusaurus/types" "2.4.0" + "@docusaurus/utils" "2.4.0" fs-extra "^10.1.0" react-json-view "^1.21.3" tslib "^2.4.0" -"@docusaurus/plugin-google-analytics@2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-analytics/-/plugin-google-analytics-2.3.1.tgz#e2e7db4cf6a7063e8ba5e128d4e413f4d6a0c862" - integrity sha512-OHip0GQxKOFU8n7gkt3TM4HOYTXPCFDjqKbMClDD3KaDnyTuMp/Zvd9HSr770lLEscgPWIvzhJByRAClqsUWiQ== +"@docusaurus/plugin-google-analytics@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-analytics/-/plugin-google-analytics-2.4.0.tgz#8062d7a09d366329dfd3ce4e8a619da8624b6cc3" + integrity sha512-uGUzX67DOAIglygdNrmMOvEp8qG03X20jMWadeqVQktS6nADvozpSLGx4J0xbkblhJkUzN21WiilsP9iVP+zkw== dependencies: - "@docusaurus/core" "2.3.1" - "@docusaurus/types" "2.3.1" - "@docusaurus/utils-validation" 
"2.3.1" + "@docusaurus/core" "2.4.0" + "@docusaurus/types" "2.4.0" + "@docusaurus/utils-validation" "2.4.0" tslib "^2.4.0" -"@docusaurus/plugin-google-gtag@2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-gtag/-/plugin-google-gtag-2.3.1.tgz#b8da54a60c0a50aca609c3643faef78cb4f247a0" - integrity sha512-uXtDhfu4+Hm+oqWUySr3DNI5cWC/rmP6XJyAk83Heor3dFjZqDwCbkX8yWPywkRiWev3Dk/rVF8lEn0vIGVocA== +"@docusaurus/plugin-google-gtag@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-gtag/-/plugin-google-gtag-2.4.0.tgz#a8efda476f971410dfb3aab1cfe1f0f7d269adc5" + integrity sha512-adj/70DANaQs2+TF/nRdMezDXFAV/O/pjAbUgmKBlyOTq5qoMe0Tk4muvQIwWUmiUQxFJe+sKlZGM771ownyOg== dependencies: - "@docusaurus/core" "2.3.1" - "@docusaurus/types" "2.3.1" - "@docusaurus/utils-validation" "2.3.1" + "@docusaurus/core" "2.4.0" + "@docusaurus/types" "2.4.0" + "@docusaurus/utils-validation" "2.4.0" tslib "^2.4.0" -"@docusaurus/plugin-google-tag-manager@2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-tag-manager/-/plugin-google-tag-manager-2.3.1.tgz#f19bc01cc784fa4734187c5bc637f0574857e15d" - integrity sha512-Ww2BPEYSqg8q8tJdLYPFFM3FMDBCVhEM4UUqKzJaiRMx3NEoly3qqDRAoRDGdIhlC//Rf0iJV9cWAoq2m6k3sw== +"@docusaurus/plugin-google-tag-manager@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-tag-manager/-/plugin-google-tag-manager-2.4.0.tgz#9a94324ac496835fc34e233cc60441df4e04dfdd" + integrity sha512-E66uGcYs4l7yitmp/8kMEVQftFPwV9iC62ORh47Veqzs6ExwnhzBkJmwDnwIysHBF1vlxnzET0Fl2LfL5fRR3A== dependencies: - "@docusaurus/core" "2.3.1" - "@docusaurus/types" "2.3.1" - "@docusaurus/utils-validation" "2.3.1" + "@docusaurus/core" "2.4.0" + "@docusaurus/types" "2.4.0" + "@docusaurus/utils-validation" "2.4.0" tslib "^2.4.0" -"@docusaurus/plugin-sitemap@2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/plugin-sitemap/-/plugin-sitemap-2.3.1.tgz#f526ab517ca63b7a3460d585876f5952cb908aa0" - integrity sha512-8Yxile/v6QGYV9vgFiYL+8d2N4z4Er3pSHsrD08c5XI8bUXxTppMwjarDUTH/TRTfgAWotRbhJ6WZLyajLpozA== - dependencies: - "@docusaurus/core" "2.3.1" - "@docusaurus/logger" "2.3.1" - "@docusaurus/types" "2.3.1" - "@docusaurus/utils" "2.3.1" - "@docusaurus/utils-common" "2.3.1" - "@docusaurus/utils-validation" "2.3.1" +"@docusaurus/plugin-sitemap@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/plugin-sitemap/-/plugin-sitemap-2.4.0.tgz#ba0eb43565039fe011bdd874b5c5d7252b19d709" + integrity sha512-pZxh+ygfnI657sN8a/FkYVIAmVv0CGk71QMKqJBOfMmDHNN1FeDeFkBjWP49ejBqpqAhjufkv5UWq3UOu2soCw== + dependencies: + "@docusaurus/core" "2.4.0" + "@docusaurus/logger" "2.4.0" + "@docusaurus/types" "2.4.0" + "@docusaurus/utils" "2.4.0" + "@docusaurus/utils-common" "2.4.0" + "@docusaurus/utils-validation" "2.4.0" fs-extra "^10.1.0" sitemap "^7.1.1" tslib "^2.4.0" -"@docusaurus/preset-classic@^2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/preset-classic/-/preset-classic-2.3.1.tgz#f0193f06093eb55cafef66bd1ad9e0d33198bf95" - integrity sha512-OQ5W0AHyfdUk0IldwJ3BlnZ1EqoJuu2L2BMhqLbqwNWdkmzmSUvlFLH1Pe7CZSQgB2YUUC/DnmjbPKk/qQD0lQ== - dependencies: - "@docusaurus/core" "2.3.1" - "@docusaurus/plugin-content-blog" "2.3.1" - "@docusaurus/plugin-content-docs" "2.3.1" - "@docusaurus/plugin-content-pages" "2.3.1" - "@docusaurus/plugin-debug" "2.3.1" - "@docusaurus/plugin-google-analytics" "2.3.1" - "@docusaurus/plugin-google-gtag" 
"2.3.1" - "@docusaurus/plugin-google-tag-manager" "2.3.1" - "@docusaurus/plugin-sitemap" "2.3.1" - "@docusaurus/theme-classic" "2.3.1" - "@docusaurus/theme-common" "2.3.1" - "@docusaurus/theme-search-algolia" "2.3.1" - "@docusaurus/types" "2.3.1" +"@docusaurus/preset-classic@^2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/preset-classic/-/preset-classic-2.4.0.tgz#92fdcfab35d8d0ffb8c38bcbf439e4e1cb0566a3" + integrity sha512-/5z5o/9bc6+P5ool2y01PbJhoGddEGsC0ej1MF6mCoazk8A+kW4feoUd68l7Bnv01rCnG3xy7kHUQP97Y0grUA== + dependencies: + "@docusaurus/core" "2.4.0" + "@docusaurus/plugin-content-blog" "2.4.0" + "@docusaurus/plugin-content-docs" "2.4.0" + "@docusaurus/plugin-content-pages" "2.4.0" + "@docusaurus/plugin-debug" "2.4.0" + "@docusaurus/plugin-google-analytics" "2.4.0" + "@docusaurus/plugin-google-gtag" "2.4.0" + "@docusaurus/plugin-google-tag-manager" "2.4.0" + "@docusaurus/plugin-sitemap" "2.4.0" + "@docusaurus/theme-classic" "2.4.0" + "@docusaurus/theme-common" "2.4.0" + "@docusaurus/theme-search-algolia" "2.4.0" + "@docusaurus/types" "2.4.0" "@docusaurus/react-loadable@5.5.2", "react-loadable@npm:@docusaurus/react-loadable@5.5.2": version "5.5.2" @@ -1469,27 +1469,27 @@ "@types/react" "*" prop-types "^15.6.2" -"@docusaurus/theme-classic@2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/theme-classic/-/theme-classic-2.3.1.tgz#8e6e194236e702c0d4e8d7b7cbb6886ae456e598" - integrity sha512-SelSIDvyttb7ZYHj8vEUhqykhAqfOPKk+uP0z85jH72IMC58e7O8DIlcAeBv+CWsLbNIl9/Hcg71X0jazuxJug== - dependencies: - "@docusaurus/core" "2.3.1" - "@docusaurus/mdx-loader" "2.3.1" - "@docusaurus/module-type-aliases" "2.3.1" - "@docusaurus/plugin-content-blog" "2.3.1" - "@docusaurus/plugin-content-docs" "2.3.1" - "@docusaurus/plugin-content-pages" "2.3.1" - "@docusaurus/theme-common" "2.3.1" - "@docusaurus/theme-translations" "2.3.1" - "@docusaurus/types" "2.3.1" - "@docusaurus/utils" "2.3.1" - "@docusaurus/utils-common" "2.3.1" - "@docusaurus/utils-validation" "2.3.1" +"@docusaurus/theme-classic@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/theme-classic/-/theme-classic-2.4.0.tgz#a5404967b00adec3472efca4c3b3f6a5e2021c78" + integrity sha512-GMDX5WU6Z0OC65eQFgl3iNNEbI9IMJz9f6KnOyuMxNUR6q0qVLsKCNopFUDfFNJ55UU50o7P7o21yVhkwpfJ9w== + dependencies: + "@docusaurus/core" "2.4.0" + "@docusaurus/mdx-loader" "2.4.0" + "@docusaurus/module-type-aliases" "2.4.0" + "@docusaurus/plugin-content-blog" "2.4.0" + "@docusaurus/plugin-content-docs" "2.4.0" + "@docusaurus/plugin-content-pages" "2.4.0" + "@docusaurus/theme-common" "2.4.0" + "@docusaurus/theme-translations" "2.4.0" + "@docusaurus/types" "2.4.0" + "@docusaurus/utils" "2.4.0" + "@docusaurus/utils-common" "2.4.0" + "@docusaurus/utils-validation" "2.4.0" "@mdx-js/react" "^1.6.22" clsx "^1.2.1" copy-text-to-clipboard "^3.0.1" - infima "0.2.0-alpha.42" + infima "0.2.0-alpha.43" lodash "^4.17.21" nprogress "^0.2.0" postcss "^8.4.14" @@ -1500,17 +1500,18 @@ tslib "^2.4.0" utility-types "^3.10.0" -"@docusaurus/theme-common@2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/theme-common/-/theme-common-2.3.1.tgz#82f52d80226efef8c4418c4eacfc5051aa215f7f" - integrity sha512-RYmYl2OR2biO+yhmW1aS5FyEvnrItPINa+0U2dMxcHpah8reSCjQ9eJGRmAgkZFchV1+aIQzXOI1K7LCW38O0g== - dependencies: - "@docusaurus/mdx-loader" "2.3.1" - "@docusaurus/module-type-aliases" "2.3.1" - "@docusaurus/plugin-content-blog" "2.3.1" - "@docusaurus/plugin-content-docs" "2.3.1" - 
"@docusaurus/plugin-content-pages" "2.3.1" - "@docusaurus/utils" "2.3.1" +"@docusaurus/theme-common@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/theme-common/-/theme-common-2.4.0.tgz#626096fe9552d240a2115b492c7e12099070cf2d" + integrity sha512-IkG/l5f/FLY6cBIxtPmFnxpuPzc5TupuqlOx+XDN+035MdQcAh8wHXXZJAkTeYDeZ3anIUSUIvWa7/nRKoQEfg== + dependencies: + "@docusaurus/mdx-loader" "2.4.0" + "@docusaurus/module-type-aliases" "2.4.0" + "@docusaurus/plugin-content-blog" "2.4.0" + "@docusaurus/plugin-content-docs" "2.4.0" + "@docusaurus/plugin-content-pages" "2.4.0" + "@docusaurus/utils" "2.4.0" + "@docusaurus/utils-common" "2.4.0" "@types/history" "^4.7.11" "@types/react" "*" "@types/react-router-config" "*" @@ -1521,19 +1522,19 @@ use-sync-external-store "^1.2.0" utility-types "^3.10.0" -"@docusaurus/theme-search-algolia@2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/theme-search-algolia/-/theme-search-algolia-2.3.1.tgz#d587b40913119e9287d14670e277b933d8f453f0" - integrity sha512-JdHaRqRuH1X++g5fEMLnq7OtULSGQdrs9AbhcWRQ428ZB8/HOiaN6mj3hzHvcD3DFgu7koIVtWPQnvnN7iwzHA== +"@docusaurus/theme-search-algolia@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/theme-search-algolia/-/theme-search-algolia-2.4.0.tgz#07d297d50c44446d6bc5a37be39afb8f014084e1" + integrity sha512-pPCJSCL1Qt4pu/Z0uxBAuke0yEBbxh0s4fOvimna7TEcBLPq0x06/K78AaABXrTVQM6S0vdocFl9EoNgU17hqA== dependencies: "@docsearch/react" "^3.1.1" - "@docusaurus/core" "2.3.1" - "@docusaurus/logger" "2.3.1" - "@docusaurus/plugin-content-docs" "2.3.1" - "@docusaurus/theme-common" "2.3.1" - "@docusaurus/theme-translations" "2.3.1" - "@docusaurus/utils" "2.3.1" - "@docusaurus/utils-validation" "2.3.1" + "@docusaurus/core" "2.4.0" + "@docusaurus/logger" "2.4.0" + "@docusaurus/plugin-content-docs" "2.4.0" + "@docusaurus/theme-common" "2.4.0" + "@docusaurus/theme-translations" "2.4.0" + "@docusaurus/utils" "2.4.0" + "@docusaurus/utils-validation" "2.4.0" algoliasearch "^4.13.1" algoliasearch-helper "^3.10.0" clsx "^1.2.1" @@ -1543,18 +1544,18 @@ tslib "^2.4.0" utility-types "^3.10.0" -"@docusaurus/theme-translations@2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/theme-translations/-/theme-translations-2.3.1.tgz#b2b1ecc00a737881b5bfabc19f90b20f0fe02bb3" - integrity sha512-BsBZzAewJabVhoGG1Ij2u4pMS3MPW6gZ6sS4pc+Y7czevRpzxoFNJXRtQDVGe7mOpv/MmRmqg4owDK+lcOTCVQ== +"@docusaurus/theme-translations@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/theme-translations/-/theme-translations-2.4.0.tgz#62dacb7997322f4c5a828b3ab66177ec6769eb33" + integrity sha512-kEoITnPXzDPUMBHk3+fzEzbopxLD3fR5sDoayNH0vXkpUukA88/aDL1bqkhxWZHA3LOfJ3f0vJbOwmnXW5v85Q== dependencies: fs-extra "^10.1.0" tslib "^2.4.0" -"@docusaurus/types@2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/types/-/types-2.3.1.tgz#785ade2e0f4e35e1eb7fb0d04c27d11c3991a2e8" - integrity sha512-PREbIRhTaNNY042qmfSE372Jb7djZt+oVTZkoqHJ8eff8vOIc2zqqDqBVc5BhOfpZGPTrE078yy/torUEZy08A== +"@docusaurus/types@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/types/-/types-2.4.0.tgz#f94f89a0253778b617c5d40ac6f16b17ec55ce41" + integrity sha512-xaBXr+KIPDkIaef06c+i2HeTqVNixB7yFut5fBXPGI2f1rrmEV2vLMznNGsFwvZ5XmA3Quuefd4OGRkdo97Dhw== dependencies: "@types/history" "^4.7.11" "@types/react" "*" @@ -1565,30 +1566,30 @@ webpack "^5.73.0" webpack-merge "^5.8.0" -"@docusaurus/utils-common@2.3.1": - version "2.3.1" - resolved 
"https://registry.yarnpkg.com/@docusaurus/utils-common/-/utils-common-2.3.1.tgz#1abe66846eb641547e4964d44f3011938e58e50b" - integrity sha512-pVlRpXkdNcxmKNxAaB1ya2hfCEvVsLDp2joeM6K6uv55Oc5nVIqgyYSgSNKZyMdw66NnvMfsu0RBylcwZQKo9A== +"@docusaurus/utils-common@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/utils-common/-/utils-common-2.4.0.tgz#eb2913871860ed32e73858b4c7787dd820c5558d" + integrity sha512-zIMf10xuKxddYfLg5cS19x44zud/E9I7lj3+0bv8UIs0aahpErfNrGhijEfJpAfikhQ8tL3m35nH3hJ3sOG82A== dependencies: tslib "^2.4.0" -"@docusaurus/utils-validation@2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/utils-validation/-/utils-validation-2.3.1.tgz#b65c718ba9b84b7a891bccf5ac6d19b57ee7d887" - integrity sha512-7n0208IG3k1HVTByMHlZoIDjjOFC8sbViHVXJx0r3Q+3Ezrx+VQ1RZ/zjNn6lT+QBCRCXlnlaoJ8ug4HIVgQ3w== +"@docusaurus/utils-validation@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/utils-validation/-/utils-validation-2.4.0.tgz#1ed92bfab5da321c4a4d99cad28a15627091aa90" + integrity sha512-IrBsBbbAp6y7mZdJx4S4pIA7dUyWSA0GNosPk6ZJ0fX3uYIEQgcQSGIgTeSC+8xPEx3c16o03en1jSDpgQgz/w== dependencies: - "@docusaurus/logger" "2.3.1" - "@docusaurus/utils" "2.3.1" + "@docusaurus/logger" "2.4.0" + "@docusaurus/utils" "2.4.0" joi "^17.6.0" js-yaml "^4.1.0" tslib "^2.4.0" -"@docusaurus/utils@2.3.1": - version "2.3.1" - resolved "https://registry.yarnpkg.com/@docusaurus/utils/-/utils-2.3.1.tgz#24b9cae3a23b1e6dc88f95c45722c7e82727b032" - integrity sha512-9WcQROCV0MmrpOQDXDGhtGMd52DHpSFbKLfkyaYumzbTstrbA5pPOtiGtxK1nqUHkiIv8UwexS54p0Vod2I1lg== +"@docusaurus/utils@2.4.0": + version "2.4.0" + resolved "https://registry.yarnpkg.com/@docusaurus/utils/-/utils-2.4.0.tgz#fdf0c3545819e48bb57eafc5057495fd4d50e900" + integrity sha512-89hLYkvtRX92j+C+ERYTuSUK6nF9bGM32QThcHPg2EDDHVw6FzYQXmX6/p+pU5SDyyx5nBlE4qXR92RxCAOqfg== dependencies: - "@docusaurus/logger" "2.3.1" + "@docusaurus/logger" "2.4.0" "@svgr/webpack" "^6.2.1" escape-string-regexp "^4.0.0" file-loader "^6.2.0" @@ -4434,10 +4435,10 @@ indent-string@^4.0.0: resolved "https://registry.yarnpkg.com/indent-string/-/indent-string-4.0.0.tgz#624f8f4497d619b2d9768531d58f4122854d7251" integrity sha512-EdDDZu4A2OyIK7Lr/2zG+w5jmbuk1DVBnEwREQvBzspBJkCEbRa8GxU1lghYcaGJCnRWibjDXlq779X1/y5xwg== -infima@0.2.0-alpha.42: - version "0.2.0-alpha.42" - resolved "https://registry.yarnpkg.com/infima/-/infima-0.2.0-alpha.42.tgz#f6e86a655ad40877c6b4d11b2ede681eb5470aa5" - integrity sha512-ift8OXNbQQwtbIt6z16KnSWP7uJ/SysSMFI4F87MNRTicypfl4Pv3E2OGVv6N3nSZFJvA8imYulCBS64iyHYww== +infima@0.2.0-alpha.43: + version "0.2.0-alpha.43" + resolved "https://registry.yarnpkg.com/infima/-/infima-0.2.0-alpha.43.tgz#f7aa1d7b30b6c08afef441c726bac6150228cbe0" + integrity sha512-2uw57LvUqW0rK/SWYnd/2rRfxNA5DDNOh33jxF7fy46VWoNhGxiUQyVZHbBMjQ33mQem0cjdDVwgWVAmlRfgyQ== inflight@^1.0.4: version "1.0.6" From 862f0d14f7792cb07585b3da2fc266c0558b5697 Mon Sep 17 00:00:00 2001 From: Kevin Carroll Date: Sat, 29 Apr 2023 18:59:15 -0700 Subject: [PATCH 5/8] WIP, updated docs for multi-part request --- docs/serverExtensions/batch-processing.md | 13 +++---- docs/serverExtensions/file-uploads.md | 44 ++++++++++++----------- 2 files changed, 29 insertions(+), 28 deletions(-) diff --git a/docs/serverExtensions/batch-processing.md b/docs/serverExtensions/batch-processing.md index 9d6250e..1c710b1 100644 --- a/docs/serverExtensions/batch-processing.md +++ b/docs/serverExtensions/batch-processing.md @@ -21,7 +21,7 @@ While batch query support is shipped 
as part of the main library it is disabled // Startup Code // other code omitted for brevity services.AddGraphQL(options => { - options.RegisterExtension(); + options.AddMultipartRequestSupport(); }); ``` @@ -87,14 +87,13 @@ curl localhost:3000/graphql \ } ] ``` -## Batch Order is Never Guaranteed -While the order of the results is guaranteed to be sorted in the same order in which the queries were received, there is no guarantee that the queries are executed in any specific order. This means if you submit a batch of 5 requests, each requests may complete in a randomized order. If the same batch is submitted 3 times, its possible that the execution order will be different each time. +## Batch Execution Order is Never Guaranteed +While the order of the results is guaranteed to be the same order in which the queries were received, there is no guarantee that the queries are executed in any specific order. This means if you submit a batch of 5 requests, each requests may complete in a randomized order. If the same batch is submitted 3 times, its possible that the execution order will be different each time. -For queries this is usally not an issue, but if you are batching mutations, make sure you don't have any unexpected dependencies between queries. If your controllers perform business logic against an existing object and that object is modified by more than of your mutations its highly possible that the state of the object may be unexpectedly modified in some executions but not in others. +For queries this is usally not an issue, but if you are batching mutations, make sure you don't have any unexpected dependencies or side effects between queries. If your controllers perform business logic against an existing object and that object is modified by more than of your mutations its highly possible that the state of the object may be unexpectedly modified in some executions but not in others. Take this controller and query: -```csharp title=Example Controller - +```csharp title="Example Controller" public class FileUploadController : GraphController { [MutationRoot("addMoney")] @@ -107,8 +106,6 @@ public class FileUploadController : GraphController await _service.UpdateItem(item); return item; } - - } ``` diff --git a/docs/serverExtensions/file-uploads.md b/docs/serverExtensions/file-uploads.md index 2603454..7a08e36 100644 --- a/docs/serverExtensions/file-uploads.md +++ b/docs/serverExtensions/file-uploads.md @@ -15,26 +15,30 @@ This document covers how to setup a controller to accept files from an http requ ## Enable File Upload Support -While file upload support is shipped as part of the main library it is disabled by default and must be explicitly enabled as an extension to each individual schema. +While file upload support is shipped as part of the main library it is disabled by default and must be explicitly enabled as an extension to each schema. ```csharp title='Register the Server Extension' // Startup Code // other code omitted for brevity services.AddGraphQL(options => { - options.RegisterExtension(); + options.AddMultipartRequestSupport(); }); ``` :::tip -File uploads and [batch query processing](./batch-processing.md) are implemented as part of the same specification and are encapsulated in the same extension. +File uploads and [batch query processing](./batch-processing.md) are implemented as part of the same specification and are encapsulated in the same "multi-part request" extension. ::: ## A Basic Controller -Files are received as a special scalar type `FileUpload`. 
Add a reference in your controller to this +Files are received as a special C# class named `FileUpload`. Add a reference in your controller to this scalar like you would any other scalar. +Warning: Be sure to dispose of the file stream when you are finished with it. +
+
+ ```csharp title=ExampleFile Upload Controller public class FileUploadController : GraphController { @@ -42,7 +46,7 @@ public class FileUploadController : GraphController // highlight-next-line public async Task UploadFile(FileUpload fileRef) { - var stream = await fileRef.OpenFileAsync(); + using var stream = await fileRef.OpenFileAsync(); // do something with the file stream return 0; @@ -50,7 +54,7 @@ public class FileUploadController : GraphController } ``` -The scalar is named `Upload` in the final graphql schema as defined in the specification. Be sure to declare your variables as an `Upload` type to indicate an uploaded file. +The scalar in your schema is named `Upload` per the specification. Be sure to declare your graphql variables as an `Upload` type to indicate an uploaded file. ```graphql title="Use the Upload graph type for variables" mutation ($file: Upload) { @@ -71,7 +75,7 @@ curl localhost:3000/graphql \ Arrays of files work just like any other list in GraphQL. When declaring the map variable for the multi-part request, be sure to indicate which index you are mapping the file to. The extension will not magically append files to an array. Each mapped file must explicitly declare the element index in an array where it is being placed. -Warning: Be sure to dispose of each file stream when you are finished with them +Warning: Be sure to dispose of each file stream when you are finished with it.

@@ -95,7 +99,7 @@ public class FileUploadController : GraphController } ``` -```graphql title="Declare a list of files on a query" +```graphql title="Declaring a list of files on a graphql query" # highlight-next-line mutation ($files: [Upload]) { multiFileUpload(file: $files) @@ -112,7 +116,7 @@ curl localhost:3000/graphql \ ``` ### Handling an Unknown Number of Files -There are scenarios where you may ask your users to select a few files to upload without knowing how many they might choose. As long each declaration in your `map` field points to a position that could be a valid index in an array on the `operations` object, the target array will be resized accordingly. +There are scenarios where you may ask your users to select a few files to upload without knowing how many they might choose. As long each declaration in your `map` field points to a position that _could be_ a valid index, the target array will be resized accordingly. ```bash title="Adding Two Files" curl localhost:3000/graphql \ @@ -136,7 +140,7 @@ In the above example, the `files` array will be automatically expanded to includ ``` ### Skipping Array Indexes -If you skip any indexes in your `map` declaration, the target array will be expanded to to include the out of sequence index. This can produce null values in your array and, depending on your query declaration, result in an error if your variable does not allow nulls. +If you skip any indexes in your `map` declaration, the target array will be expanded to to include the out of sequence index. This can produce null values in your array and result in an error if your variable declaration does not allow nulls. ```bash title="Adding One File To Index 5" # Only one file is supplied but its mapped to index 5 @@ -154,6 +158,7 @@ curl localhost:3000/graphql \ ```json title="Resultant Operations Object" { "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", + // highlight-next-line "variables": { "files": [null, null, null, null, null, ] } } ``` @@ -174,7 +179,7 @@ curl localhost:3000/graphql \ ``` ## The FileUpload Scalar -The following properties on the `FileUpload` scalar can be useful: +The following properties on the `FileUpload` C# class can be useful: * `FileName` - The name of the file that was uploaded. This property will be null if a non-file form field is referenced. * `MapKey` - The key value used to place this file within a variable collection. This is usually the form field name on the multi-part request. @@ -195,7 +200,7 @@ public class FileUploadController : GraphController // do something with the file stream // it is your responsibility to close and dispose of it // highlight-next-line - var stream = await fileRef.OpenFileStreamAsync(); + using var stream = await fileRef.OpenFileStreamAsync(); return 0; } @@ -203,7 +208,7 @@ public class FileUploadController : GraphController ``` ## Custom File Handling -By default, this extension just splits the request on an `HttpContext` and presents the different parts to the query engine in a manner it expects. This means that any uploaded files are consumed under the hood as ASP.NET's built in `IFormFile` interface. While this is fine for most users, it can be troublesome with regard to timeouts and large file requests. Also, there may be scenarios where you want to save off files prior to executing a query or perhaps you'll need to process the file stream multiple times? 
+By default, this extension splits the POST request on an `HttpContext` and presents the different parts to the query engine in a manner it expects. This means that any uploaded files are consumed under the hood as ASP.NET's built in `IFormFile` interface. While this is fine for most users, it can be troublesome with regard to timeouts and large file requests. Also, there may be scenarios where you want to save off files prior to executing a query or perhaps you'll need to process the file stream multiple times. You can implement and register your own `IFileUploadScalarValueMaker` to add custom processing logic for each file or blob BEFORE graphql gets ahold of it. For instance, some users may want to write incoming files to local disk or cloud storage and present GraphQL with a stream that points to that local reference, rather than the file reference on the http request. @@ -222,25 +227,24 @@ You can implement and register your own `IFileUploadScalarValueMaker` to add cus ```csharp title="Register Your Custom Value Maker" -// startup code +// other startup code omitted -// register your value maker BEFORE calling .AddGraphQL +// register your scalar value maker BEFORE calling .AddGraphQL services.AddSingleton(); services.AddGraphQL(options => { - options.RegisterExtension(); + options.AddMultipartRequestSupport(); }); ``` :::tip -You can inherit from `FileUpload` and extend it as needed. Just be sure to declare it as `FileUpload` in your controllers so that GraphQL knows -what scalar you are requesting. +You can inherit from `FileUpload` and extend it as needed on your custom maker. However, be sure to declare your method parameters as `FileUpload` in your controllers so that GraphQL knows what scalar you are requesting. ::: Take a look at the [default upload scalar value maker](https://google.com) for some helpful details when trying to implement your own. ## Timeouts and File Uploads -Be mindful of any query timeouts you have set for your schemas. ASP.NET may start processing your query before all the file contents are made available to the server as long as it has the initial POST request. This also means that your graphql queries may start executing before the file contents arrive. +Be mindful of any query timeouts you have set for your schemas. ASP.NET may start processing your query before all the file contents are made available to the server as long as it has the initial POST request. This also means that your graphql queries may start executing before the file contents arrive. -While this asysncronicty usually works to your advantage, you may find that your queries pause on `.OpenFileStreamAsync()` waiting for the file for a long period of time if there is a network delay or a large file being uploaded. If you have a [custom timeout](../reference/schema-configuration.md#querytimeout) configured for a schema, it may trigger while waiting for the file. Be sure to set your timeouts to a long enough period of time to avoid this issue. \ No newline at end of file +While this asysncronicty usually works to your advantage, allowing your queries to begin processing before all the files are uploaded to the server; you may find that your queries pause on `.OpenFileStreamAsync()` waiting for the file stream to become available if there is a network delay or a large file being uploaded. If you have a [custom timeout](../reference/schema-configuration.md#querytimeout) configured for a schema, it may trigger while waiting for the file. 
Be sure to set your timeouts to a long enough period of time to avoid this scenario. \ No newline at end of file From 038b40b35a85780ae9f6ac194f4ea75c1189bd8d Mon Sep 17 00:00:00 2001 From: Kevin Carroll Date: Sat, 6 May 2023 17:07:33 -0700 Subject: [PATCH 6/8] WIP, consolidated file uploads and batch queries together --- .../_category_.json | 0 docs/server-extensions/multipart-requests.md | 509 ++++++++++++++++++ docs/serverExtensions/batch-processing.md | 189 ------- docs/serverExtensions/file-uploads.md | 250 --------- 4 files changed, 509 insertions(+), 439 deletions(-) rename docs/{serverExtensions => server-extensions}/_category_.json (100%) create mode 100644 docs/server-extensions/multipart-requests.md delete mode 100644 docs/serverExtensions/batch-processing.md delete mode 100644 docs/serverExtensions/file-uploads.md diff --git a/docs/serverExtensions/_category_.json b/docs/server-extensions/_category_.json similarity index 100% rename from docs/serverExtensions/_category_.json rename to docs/server-extensions/_category_.json diff --git a/docs/server-extensions/multipart-requests.md b/docs/server-extensions/multipart-requests.md new file mode 100644 index 0000000..8156fa6 --- /dev/null +++ b/docs/server-extensions/multipart-requests.md @@ -0,0 +1,509 @@ +--- +id: multipart-requests +title: Multipart Form Request Extension +sidebar_label: File Uploads & Batching +sidebar_position: 0 +--- + +.NET 6+ + +## Multipart Request Specification +GraphQL ASP.NET provides built in support for batch query processing and file uploads via an implementation of the [GraphQL Multipart Request Specification](https://github.com/jaydenseric/graphql-multipart-request-spec). + +:::caution +This document covers how to submit a batch query and upload files that conform to the above specification. It provides sample curl requests that would be accepted for the given sample code but does not explain in detail the various form fields required to complete a request. It is highly recommended to use a [supported client](https://github.com/jaydenseric/graphql-multipart-request-spec#client) when enabling this server extension. +::: + +## Enable The Extension + +While the multipart form extension is shipped as part of the main library it is disabled by default and must be explicitly enabled on each schema. + +```csharp title='Register the Server Extension' +// Startup Code +// other code omitted for brevity +services.AddGraphQL(options => { + options.AddMultipartRequestSupport(); +}); +``` + +## File Uploads +Files submitted on a post request are automatically routed to your controllers as a custom scalar. Out of the box, any .NET `IFormFile` and any form field not explicitly declared by the specification will be converted into a file scalar and can be mapped into your query's variables collection. + +### A Basic Controller + +Files are received as a special C# class named `FileUpload`. Use it in your action methods like you would any other scalar (e.g. int, string etc.). Note that even though it is a class, as opposed to a primitative, GraphQL still handles it as a scalar; much in the same way `Uri` is also considered a scalar. + +Warning: Be sure to dispose of the file stream when you are finished with it. +
+
+ +```csharp title=ExampleFile Upload Controller +public class FileUploadController : GraphController +{ + [MutationRoot("singleFileUpload")] + // highlight-next-line + public async Task UploadFile(FileUpload fileRef) + { + using var stream = await fileRef.OpenFileAsync(); + // do something with the file stream + + return 0; + } +} +``` + +The scalar in your schema is named `Upload` per the specification. Be sure to declare your graphql variables as an `Upload` type to indicate an uploaded file. + +```graphql title="Use the Upload graph type for variables" +mutation ($file: Upload) { + singleFileUpload(file: $file) +} +``` + +```bash title="Sample curl Query" +curl localhost:3000/graphql \ + # highlight-next-line + -F operations='{ "query": "mutation ($file: Upload) { singleFileUpload(file: $file) }", "variables": { "file": null } }' \ + -F map='{ "0": ["variables.file"] }' \ + -F 0=@a.txt +``` + +### Handling Arrays of Files + +Arrays of files work just like any other list in GraphQL. When declaring the map variable for the multi-part request, be sure +to indicate which index you are mapping the file to. The extension will not magically append files to an array. Each mapped file must explicitly declare the element index in an array where it is being placed. + +Warning: Be sure to dispose of each file stream when you are finished with it. +
+
+
+```csharp title="Example File Upload Controller"
+using GraphQL.AspNet.ServerExtensions.MultipartRequests;
+
+public class FileUploadController : GraphController
+{
+    [MutationRoot("multiFileUpload")]
+    // highlight-next-line
+    public async Task UploadFile(IEnumerable files)
+    {
+        foreach(var file in files)
+        {
+            using var stream = await file.OpenFileAsync();
+            // do something with each file stream
+        }
+
+        return 0;
+    }
+}
+```
+
+```graphql title="Declaring a list of files on a graphql query"
+# highlight-next-line
+mutation ($files: [Upload]) {
+    multiFileUpload(files: $files)
+}
+```
+
+```bash title="Sample Curl"
+curl localhost:3000/graphql \
+# highlight-next-line
+  -F operations='{ "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", "variables": { "files": [null, null] } }' \
+  -F map='{ "firstFile": ["variables", "files", 0], "secondFile": ["variables", "files", 1] }' \
+  -F firstFile=@a.txt \
+  -F secondFile=@b.txt
+```
+
+### Handling an Unknown Number of Files
+There are scenarios where you may ask your users to select a few files to upload without knowing how many they might choose. As long as each declaration in your `map` field points to a position that _could be_ a valid index, the target array will be resized accordingly.
+
+```bash title="Adding Two Files"
+curl localhost:3000/graphql \
+  -F operations='{
+        "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }",
+        # highlight-next-line
+        "variables": { "files": [] } }' \
+  -F map='{ "firstFile": ["variables", "files", 0], "secondFile": ["variables", "files", 1] }' \
+  -F firstFile=@a.txt \
+  -F secondFile=@b.txt
+```
+
+
+In the above example, the `files` array will be automatically expanded to include indexes 0 and 1 as requested by the `map`:
+
+```json title="Resultant Operations Object"
+{
+    "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }",
+    "variables": { "files": [, ] }
+}
+```
+
+### Skipping Array Indexes
+If you skip any indexes in your `map` declaration, the target array will be expanded to include the out-of-sequence index. This can produce null values in your array and result in an error if your variable declaration does not allow nulls.
+
+```bash title="Adding One File To Index 5"
+# Only one file is supplied but it is mapped to index 5;
+# the final array at `variables.files` will be 6 elements long with 5 null elements.
+curl localhost:3000/graphql \
+  -F operations='{
+        "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }",
+        # highlight-next-line
+        "variables": { "files": [] } }' \
+  # highlight-next-line
+  -F map='{ "firstFile": ["variables", "files", 5] }' \
+  -F firstFile=@a.txt
+```
+
+```json title="Resultant Operations Object"
+{
+    "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }",
+    // highlight-next-line
+    "variables": { "files": [null, null, null, null, null, ] }
+}
+```
+
+### File Uploads on Batched Queries
+File uploads work in conjunction with batched queries. When processing a multi-part request as a batch, prefix each of the mapped object-path references with the index of the batch item you want the file to apply to. As you might guess, this is usually handled by a supported client automatically.
+
+```bash title="Sample Query"
+curl localhost:3000/graphql \
+  -F operations='[
+      { "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", "variables": { "files": [] } },
+      { "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", "variables": { "files": [] } }
+    ]' \
+  # highlight-next-line
+  -F map='{ "firstFile": [0, "variables", "files", 0], "secondFile": [1, "variables", "files", 0] }' \
+  -F firstFile=@a.txt \
+  -F secondFile=@b.txt
+```
+
+### FileUpload Scalar
+The following properties on the `FileUpload` C# class can be useful:
+
+* `FileName` - The name of the file that was uploaded. This property will be null if a non-file form field is referenced.
+* `MapKey` - The key value used to place this file within a variable collection. This is usually the form field name on the multi-part request.
+* `ContentType` - The supplied `content-type` value sent with the file. This value will be null for non-file fields.
+* `Headers` - A collection of all the headers provided with the uploaded file. This value will be null for non-file fields.
+
+### Opening a File Stream
+When opening a file stream you need to await a call to `FileUpload.OpenFileAsync()`. This method is an abstraction on top of an internal wrapper that standardizes file streams across all implementations (see below for implementing your own file processor). When working with the standard `IFormFile` interface provided by ASP.NET this call is a simple wrapper for `IFormFile.OpenReadStream()`.
+
+```csharp title="Example File Upload Controller"
+using GraphQL.AspNet.ServerExtensions.MultipartRequests;
+
+public class FileUploadController : GraphController
+{
+    [MutationRoot("singleFileUpload")]
+    public async Task UploadFile(FileUpload fileRef)
+    {
+        // do something with the file stream
+        // it is your responsibility to close and dispose of it
+        // highlight-next-line
+        using var stream = await fileRef.OpenFileStreamAsync();
+
+        return 0;
+    }
+}
+```
+
+### Custom File Handling
+By default, this extension splits the POST request on an `HttpContext` and presents the different parts to the query engine in a manner it expects. This means that any uploaded files are consumed under the hood through ASP.NET's built-in `IFormFile` interface. While this is fine for most users, it can be troublesome with regard to timeouts and large file requests. Also, there may be scenarios where you want to save off files prior to executing a query or perhaps you'll need to process the file stream multiple times.
+
+You can implement and register your own `IFileUploadScalarValueMaker` to add custom processing logic for each file or blob BEFORE graphql gets ahold of it. For instance, some users may want to write incoming files to local disk or cloud storage and present GraphQL with a stream that points to that local reference, rather than the file reference on the http request.
+
+```csharp
+public interface IFileUploadScalarValueMaker
+{
+    // This overload is used when processing traditional files received as part of a
+    // multi-part form through ASP.NET's HttpContext
+    Task CreateFileScalar(IFormFile aspNetFile);
+
+    // This overload is used when processing data received on a
+    // multi-part form field rather than as a formal file upload.
+    Task CreateFileScalar(string mapKey, byte[] blobData);
+}
+```
+
+
+```csharp title="Register Your Custom Value Maker"
+// other startup code omitted
+
+// register your scalar value maker BEFORE calling .AddGraphQL
+services.AddSingleton();
+
+services.AddGraphQL(options => {
+    options.AddMultipartRequestSupport();
+});
+```
+
+:::tip
+You can inherit from `FileUpload` and extend it as needed on your custom maker. However, be sure to declare your method parameters as `FileUpload` in your controllers so that GraphQL knows what scalar you are requesting.
+:::
+
+Take a look at the [default upload scalar value maker](https://google.com) for some helpful details when trying to implement your own.
+
+### Timeouts and File Uploads
+
+Be mindful of any query timeouts you have set for your schemas. As long as ASP.NET has received the initial POST request, it may start processing your query before all the file contents are made available to the server. This also means that your graphql queries may start executing before the file contents arrive.
+
+While this asynchronicity usually works to your advantage, allowing your queries to begin processing before all the files are uploaded to the server, you may find that your queries pause on `.OpenFileStreamAsync()` waiting for the file stream to become available if there is a network delay or a large file being uploaded. If you have a [custom timeout](../reference/schema-configuration.md#querytimeout) configured for a schema, it may trigger while waiting for the file. Be sure to set your timeouts to a long enough period of time to avoid this scenario.
+
+
+## Batch Queries
+
+### Processing a Batch of Queries
+Provide an "operations" form field that represents an array of graphql requests and the engine will automatically detect the array and return an array of responses in the same order as the queries were received. Each query is processed asynchronously and independently.
+
+```bash title="Example Batch Query"
+curl localhost:3000/graphql \
+  # highlight-start
+  -F operations='[
+      { "query": "query { findUser(lastName: \"Smith\") {firstName lastName} }" },
+      { "query": "query { findUser(lastName: \"Jones\") {firstName lastName} }" }
+    ]' \
+  # highlight-end
+```
+
+```json title="Example Json Serialized Response"
+[
+  {
+    "data": {
+      "findUser": {
+        "firstName": "Baily",
+        "lastName": "Smith"
+      }
+    }
+  },
+  {
+    "data": {
+      "findUser": {
+        "firstName": "Caleb",
+        "lastName": "Jones"
+      }
+    }
+  }
+]
+```
+
+### Processing a Single Query
+Provide an "operations" form field that represents a single query and the engine will automatically detect it and return a normal graphql response.
+
+```bash title="Example Single Query"
+curl localhost:3000/graphql \
+  # highlight-next-line
+  -F operations='{ "query": "query { findUser(lastName: \"Smith\") {firstName lastName} }" }'
+```
+
+```json title="Example Json Serialized Response"
+{
+  "data": {
+    "findUser": {
+      "firstName": "Baily",
+      "lastName": "Smith"
+    }
+  }
+}
+```
+
+:::info
+The extension is backwards compatible with standard graphql http request processing. If a request is received that is not a multi-part form POST request, normal graphql processing will occur.
+:::
+
+### Batch Execution Order is Never Guaranteed
+While the order of the results is guaranteed to be the same order in which the queries were received, there is no guarantee that the queries are executed in any specific order. This means if you submit a batch of 5 requests, each request may complete in a randomized order.
## Batch Queries

### Processing a Batch of Queries
Provide an "operations" form field that represents an array of graphql requests, and the engine will automatically detect the array and return an array of responses in the same order as the queries were received. Each query is processed asynchronously and independently.

```bash title="Example Batch Query"
curl localhost:3000/graphql \
    #highlight-start
    -F operations='[
        { "query": "query { findUser(lastName: \"Smith\") {firstName lastName} }" },
        { "query": "query { findUser(lastName: \"Jones\") {firstName lastName} }" }
    ]' \
    # highlight-end
```

```json title="Example Json Serialized Response"
[
    {
        "data": {
            "findUser": {
                "firstName": "Baily",
                "lastName": "Smith"
            }
        }
    },
    {
        "data": {
            "findUser": {
                "firstName": "Caleb",
                "lastName": "Jones"
            }
        }
    }
]
```

### Processing a Single Query
Provide an "operations" form field that represents a single query, and the engine will automatically detect it and return a normal graphql response.

```bash title="Example Single Query"
curl localhost:3000/graphql \
    #highlight-next-line
    -F operations='{ "query": "query { findUser(lastName: \"Smith\") {firstName lastName} }" }' \
```

```json title="Example Json Serialized Response"
{
    "data": {
        "findUser": {
            "firstName": "Baily",
            "lastName": "Smith"
        }
    }
}
```

:::info
The extension is backwards compatible with standard graphql http request processing. If a request is received that is not a multi-part form POST request, normal graphql processing will occur.
:::

### Batch Execution Order is Never Guaranteed
While the order of the results is guaranteed to be the same order in which the queries were received, there is no guarantee that the queries are executed in any specific order. This means that if you submit a batch of 5 requests, each request may complete in a randomized order. If the same batch is submitted 3 times, it's possible that the execution order will be different each time.

For queries this is usually not an issue, but if you are batching mutations, make sure you don't have any unexpected dependencies or side effects between queries. If your controllers perform business logic against an existing object and that object is modified by more than one of your mutations, it's highly possible that the state of the object may be unexpectedly modified in some executions but not in others.

Take this controller and query:
```csharp title="Example Controller"
public class FileUploadController : GraphController
{
    [MutationRoot("addMoney")]
    // highlight-next-line
    public async Task<Item> AddMoney(int itemId, int dollarsToAdd)
    {
        var item = await _service.RetrieveItem(itemId);
        item.CurrentTotal += dollarsToAdd;

        await _service.UpdateItem(item);
        return item;
    }
}
```

```bash title="Example Batch Query"
curl localhost:3000/graphql \
    #highlight-start
    -F operations='[
        { "query": "mutation { addMoney(itemId: 34, dollarsToAdd: 5) {id currentTotal} }" },
        { "query": "mutation { addMoney(itemId: 34, dollarsToAdd: 3) {id currentTotal} }" }
    ]' \
    # highlight-end
```

Assuming that the initial value of `currentTotal` was 0, all three of these responses are equally likely to occur depending on the order in which the execution engine decides to process the queries.
```json title="Sample Json Results"
// When the queries are executed in declared order
[
    {
        "data": {
            "addMoney": {
                "id": 34,
                "currentTotal": 5
            }
        }
    },
    {
        "data": {
            "addMoney": {
                "id": 34,
                "currentTotal": 8
            }
        }
    }
]

// When the queries are executed in reverse order
[
    {
        "data": {
            "addMoney": {
                "id": 34,
                "currentTotal": 8
            }
        }
    },
    {
        "data": {
            "addMoney": {
                "id": 34,
                "currentTotal": 3
            }
        }
    }
]

// When the queries are executed simultaneously
// the final result updated to the datastore is unknown
[
    {
        "data": {
            "addMoney": {
                "id": 34,
                "currentTotal": 5
            }
        }
    },
    {
        "data": {
            "addMoney": {
                "id": 34,
                "currentTotal": 3
            }
        }
    }
]
```

Under the hood, the batch processor will parse and submit all queries to the engine simultaneously and wait for them to finish before structuring a result object.
:::caution
Ensure there are no dependencies between queries in a batch. An expected order of execution is never guaranteed.
:::

## Configuration
There are several configuration settings specific to this extension. They can all be set when the extension is registered. Each configuration is specific to the targeted schema.

```csharp title='Configuring the Server Extension'
// Startup Code
// other code omitted for brevity
services.AddGraphQL(options => {
    // highlight-start
    options.AddMultipartRequestSupport(mpOptions => {
        // set mpOptions here
    });
    // highlight-end
});
```

### MapMode
```csharp
// usage example
mpOptions.MapMode = MultipartRequestMapHandlingMode.Default;
```

A bitwise flag enumeration controlling which types of values are accepted for the `map` field dictated by the specification. Both options are enabled by default.

| Option | Description |
| ------------- | ----------------- |
| `AllowStringPaths` | When enabled, the short-hand syntax for `object-path`, which uses a dot-delimited string instead of an array to indicate a json path, is acceptable for a map value. |
| `SplitDotDelimitedSingleElementArrays` | When enabled, the extension will examine single-element arrays and, if the lone element is a string, treat it as if it were a string value, allowing it to be split as a dot-delimited path when `AllowStringPaths` is also enabled. When not enabled, single-element arrays are treated as a single path value. |
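
For example, assuming the `singleFileUpload` mutation shown earlier in this document, the request below uses the dot-delimited short-hand for its `map` value. Under the default map mode it should be treated the same as the array form (`["variables", "file"]`); the file name is illustrative only.

```bash title="Sample Query Using the Dot-Delimited map Short-Hand"
curl localhost:3000/graphql \
    -F operations='{ "query": "mutation ($file: Upload) { singleFileUpload(file: $file) }", "variables": { "file": null } }' \
    # highlight-next-line
    -F map='{ "0": "variables.file" }' \
    -F 0=@a.txt
```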
### RequestMode

```csharp
// usage example
mpOptions.RequestMode = MultipartRequestMode.Default;
```

A bitwise flag enumeration that controls which actions the multi-part request extension will process. By default, both batch queries and file uploads are enabled.

| Option | Description |
| ------------- | ----------------- |
| `FileUploads` | When enabled, the server extension will process file uploads. When disabled, any included files or form fields treated as files will cause the request to be rejected. |
| `BatchQueries` | When enabled, the extension will attempt to process properly formatted batch queries. When disabled, any attempt to submit a batch query will cause the request to be rejected. |

### MaxFileCount

```csharp
// usage example
mpOptions.MaxFileCount = 15;
```

| Default Value | Acceptable Values |
| ------------- | ----------------- |
| `null` | `null`, number |

When set, the extension will process, at most, the indicated number of files. If more files appear on the request than the value indicated, the request is automatically rejected. By default this value is set to `null`, meaning no limit.

### MaxBlobCount

```csharp
// usage example
mpOptions.MaxBlobCount = 15;
```

| Default Value | Acceptable Values |
| ------------- | ----------------- |
| `null` | `null`, number |

When set, the extension will process, at most, the indicated number of additional, non-spec form fields (e.g. additional text blobs). If more blobs appear on the request than the value indicated, the request is automatically rejected. By default this value is set to `null`, meaning no limit.

### RegisterMultipartRequestHttpProcessor

```csharp
// usage example
mpOptions.RegisterMultipartRequestHttpProcessor = true;
```

| Default Value | Acceptable Values |
| ------------- | ----------------- |
| `true` | `true`, `false` |

Determines if, when registering the extension, the default multipart http processor is registered. When set to true, the extension will attempt to replace any other registered http processor (i.e. the object that is handed an `HttpContext` via a route). When false, no processor is registered and you are expected to provide your own handling for multipart requests. The extension will always register its other required objects (the form parser, the custom scalar etc.).

### RequireMultipartRequestHttpProcessor

```csharp
// usage example
mpOptions.RequireMultipartRequestHttpProcessor = true;
```

| Default Value | Acceptable Values |
| ------------- | ----------------- |
| `true` | `true`, `false` |

Determines if, when starting up the application, the extension will check that the required http processor is registered. When set to true, if the required processor is not registered, a configuration exception will be thrown and the server will fail to start. This can be helpful when registering multiple extensions, ensuring that a valid processor is registered and that multipart form requests will be handled correctly.
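
As a combined illustration, the sketch below sets several of the options described above in a single registration. It assumes the flag names match the option names listed in the tables above; the specific limits chosen are arbitrary and should be tuned to your own workload.

```csharp title="Example: Combining Several Extension Options"
// Startup Code
// other code omitted for brevity
services.AddGraphQL(options => {
    options.AddMultipartRequestSupport(mpOptions => {
        // accept both file uploads and batch queries (the default behavior)
        mpOptions.RequestMode = MultipartRequestMode.FileUploads
            | MultipartRequestMode.BatchQueries;

        // accept both of the map value formats described above
        mpOptions.MapMode = MultipartRequestMapHandlingMode.AllowStringPaths
            | MultipartRequestMapHandlingMode.SplitDotDelimitedSingleElementArrays;

        // cap the number of files and extra text blobs per request (arbitrary values)
        mpOptions.MaxFileCount = 10;
        mpOptions.MaxBlobCount = 5;
    });
});
```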
\ No newline at end of file diff --git a/docs/serverExtensions/batch-processing.md b/docs/serverExtensions/batch-processing.md deleted file mode 100644 index 1c710b1..0000000 --- a/docs/serverExtensions/batch-processing.md +++ /dev/null @@ -1,189 +0,0 @@ ---- -id: batch-processing -title: Batch Query Processing -sidebar_label: Batch Processing -sidebar_position: 1 ---- -.NET 6+ - -## GraphQL Multipart Request Specification -GraphQL ASP.NET provides built in support for batch query processing via an implementation of the [GraphQL Multipart Request Specification](https://github.com/jaydenseric/graphql-multipart-request-spec). - -:::caution -This document covers how to submit a batch query that conforms to the above specification. It provides sample curl requests that would be accepted for the given sample code but does not explain in detail the various form fields required to complete a request. It is highly recommended to use a [supported client](https://github.com/jaydenseric/graphql-multipart-request-spec#client) when enabling this server extension. -::: - -## Enable Batch Query Support - -While batch query support is shipped as part of the main library it is disabled by default and must be explicitly enabled as an extension to each individual schema. - -```csharp title='Register the Server Extension' -// Startup Code -// other code omitted for brevity -services.AddGraphQL(options => { - options.AddMultipartRequestSupport(); -}); -``` - - -:::tip -Batch query processing and [file uploads](./file-uploads.md) are implemented as part of the same specification and therefore are encapsulated in the same extension. -::: - -## Processing a Single Query -Provide an "operations" form field that represents a single query and the engine will automatically detect and return a normal graphql response. - -```bash title="Example Batch Query" -curl localhost:3000/graphql \ - #highlight-next-line - -F operations='{ "query": "query { findUser(lastName: \"Smith\") {firstName lastName} }" }' \ -``` - -```json title="Example Json Serialized Response" -{ - "data": { - "findUser": { - "firstName": "Baily", - "lastName": "Smith" - } - } -} -``` - -:::info -The extension is backwards compatible with standard graphql http request processing. If a request is recieved that is not a multi-part form POST request, normal graphql processing will occur. -::: - -## Processing a Batch of Queries -Provide an "operations" form field that represents an array of graphql requests, the engine will automatically detect the array and return an array of responses in the same order as they were received. Each query is processed asyncronously and independently. - -```bash title="Example Batch Query" -curl localhost:3000/graphql \ - #highlight-start - -F operations='[ - { "query": "query { findUser(lastName: \"Smith\") {firstName lastName} }" }, - { "query": "query { findUser(lastName: \"Jones\") {firstName lastName} }" }, - ]' \ - # highlight-end -``` - -```json title="Example Json Serialized Response" -[ - { - "data": { - "findUser": { - "firstName": "Baily", - "lastName": "Smith" - } - } - }, - { - "data": { - "findUser": { - "firstName": "Caleb", - "lastName": "Jones" - } - } - } -] -``` -## Batch Execution Order is Never Guaranteed -While the order of the results is guaranteed to be the same order in which the queries were received, there is no guarantee that the queries are executed in any specific order. This means if you submit a batch of 5 requests, each requests may complete in a randomized order. 
If the same batch is submitted 3 times, its possible that the execution order will be different each time. - -For queries this is usally not an issue, but if you are batching mutations, make sure you don't have any unexpected dependencies or side effects between queries. If your controllers perform business logic against an existing object and that object is modified by more than of your mutations its highly possible that the state of the object may be unexpectedly modified in some executions but not in others. - -Take this controller and query: -```csharp title="Example Controller" -public class FileUploadController : GraphController -{ - [MutationRoot("addMoney")] - // highlight-next-line - public async Task AddMoney(int itemId, int dollarsToAdd) - { - var item =await _service.RetrieveItem(itemId); - item.CurrentTotal += dollarsToAdd; - - await _service.UpdateItem(item); - return item; - } -} -``` - -```bash title="Example Batch Query" -curl localhost:3000/graphql \ - #highlight-start - -F operations='[ - { "query": "mutation { addMoney(itemId: 34, dollarsToAdd: 5) {id currentTotal} }" }, - { "query": "mutation { addThreeDollars(itemId: 34, , dollarsToAdd: 3) {id currentTotal} }" }, - ]' \ - # highlight-end -``` - -Assuming that the initial value of `currentTotal` was 0, all three of these responses are equally likely to occur depending on the order in which the execution engine decides to process the queries. -```json title=Sample Json Results -// When the queries are executed in declared order -[ - { - "data": { - "addMoney": { - "id": 34, - "currentTotal": 5 - } - } - }, - { - "data": { - "addMoney": { - "id": 34, - "currentTotal": 8 - } - } - }, -] - -// When the queries are executed in reverse order -[ - { - "data": { - "addMoney": { - "id": 34, - "currentTotal": 8 - } - } - }, - { - "data": { - "addMoney": { - "id": 34, - "currentTotal": 3 - } - } - }, -] - -// When the queries are executed simultaniously -// The final result updated to the datastore is unknown -[ - { - "data": { - "addMoney": { - "id": 34, - "currentTotal": 5 - } - } - }, - { - "data": { - "addMoney": { - "id": 34, - "currentTotal": 3 - } - } - }, -] -``` - -Under the hood, the batch process will parse and submit all queries to the engine simultaniously and wait for them to finish before structuring a result object. -:::caution -Ensure there are no dependencies between queries in a batch. An expected order of execution is never guaranteed. -::: \ No newline at end of file diff --git a/docs/serverExtensions/file-uploads.md b/docs/serverExtensions/file-uploads.md deleted file mode 100644 index 7a08e36..0000000 --- a/docs/serverExtensions/file-uploads.md +++ /dev/null @@ -1,250 +0,0 @@ ---- -id: file-uploads -title: File Uploads -sidebar_label: File Uploads -sidebar_position: 0 ---- -.NET 6+ - -## GraphQL Multipart Request Specification -GraphQL ASP.NET provides built in support for file uploads via an implementation of the [GraphQL Multipart Request Specification](https://github.com/jaydenseric/graphql-multipart-request-spec). - -:::caution -This document covers how to setup a controller to accept files from an http request that conforms to the above specification. It provides sample curl requests that would be accepted for the given sample code but does not explain in detail the various form fields required to complete a request. It is highly recommended to use a [supported client](https://github.com/jaydenseric/graphql-multipart-request-spec#client) when enabling this server extension. 
-::: - -## Enable File Upload Support - -While file upload support is shipped as part of the main library it is disabled by default and must be explicitly enabled as an extension to each schema. - -```csharp title='Register the Server Extension' -// Startup Code -// other code omitted for brevity -services.AddGraphQL(options => { - options.AddMultipartRequestSupport(); -}); -``` - -:::tip -File uploads and [batch query processing](./batch-processing.md) are implemented as part of the same specification and are encapsulated in the same "multi-part request" extension. -::: - - -## A Basic Controller - -Files are received as a special C# class named `FileUpload`. Add a reference in your controller to this -scalar like you would any other scalar. - -Warning: Be sure to dispose of the file stream when you are finished with it. -
-
- -```csharp title=ExampleFile Upload Controller -public class FileUploadController : GraphController -{ - [MutationRoot("singleFileUpload")] - // highlight-next-line - public async Task UploadFile(FileUpload fileRef) - { - using var stream = await fileRef.OpenFileAsync(); - // do something with the file stream - - return 0; - } -} -``` - -The scalar in your schema is named `Upload` per the specification. Be sure to declare your graphql variables as an `Upload` type to indicate an uploaded file. - -```graphql title="Use the Upload graph type for variables" -mutation ($file: Upload) { - singleFileUpload(file: $file) -} -``` - -```bash title="Sample curl Query" -curl localhost:3000/graphql \ - # highlight-next-line - -F operations='{ "query": "mutation ($file: Upload) { singleFileUpload(file: $file) }", "variables": { "file": null } }' \ - -F map='{ "0": ["variables.file"] }' \ - -F 0=@a.txt -``` - -## Handling Arrays of Files - -Arrays of files work just like any other list in GraphQL. When declaring the map variable for the multi-part request, be sure -to indicate which index you are mapping the file to. The extension will not magically append files to an array. Each mapped file must explicitly declare the element index in an array where it is being placed. - -Warning: Be sure to dispose of each file stream when you are finished with it. -
-
- -```csharp title="Example File Upload Controller" -using GraphQL.AspNet.ServerExtensions.MultipartRequests; - -public class FileUploadController : GraphController -{ - [MutationRoot("multiFileUpload")] - // highlight-next-line - public async Task UploadFile(IEnumerable files) - { - foreach(var file in files) - { - using var stream = await fileRef.OpenFileAsync(); - // do something with each file stream - } - - return 0; - } -} -``` - -```graphql title="Declaring a list of files on a graphql query" -# highlight-next-line -mutation ($files: [Upload]) { - multiFileUpload(file: $files) -} -``` - -```bash title="Sample Curl" -curl localhost:3000/graphql \ -# highlight-next-line - -F operations='{ "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", "variables": { "files": [null, null] } }' \ - -F map='{ "firstFile": ["variables", "files", 0], "secondFile": ["variables", "files", 1] }' \ - -F firstFile=@a.txt - -F secondFile=@b.txt -``` - -### Handling an Unknown Number of Files -There are scenarios where you may ask your users to select a few files to upload without knowing how many they might choose. As long each declaration in your `map` field points to a position that _could be_ a valid index, the target array will be resized accordingly. - -```bash title="Adding Two Files" -curl localhost:3000/graphql \ - -F operations='{ - "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", - # highlight-next-line - "variables": { "files": [] } }' \ - -F map='{ "firstFile": ["variables", "files", 0], "secondFile": ["variables", "files", 1] }' \ - -F firstFile=@a.txt - -F secondFile=@b.txt -``` - - -In the above example, the `files` array will be automatically expanded to include indexes 0 and 1 as requested by the `map`: - -```json title="Resultant Operations Object" -{ - "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", - "variables": { "files": [, ] } -} -``` - -### Skipping Array Indexes -If you skip any indexes in your `map` declaration, the target array will be expanded to to include the out of sequence index. This can produce null values in your array and result in an error if your variable declaration does not allow nulls. - -```bash title="Adding One File To Index 5" -# Only one file is supplied but its mapped to index 5 -# the final array at 'variables.files` will be 6 elements long with 5 null elements. -curl localhost:3000/graphql \ - -F operations='{ - "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", - # highlight-next-line - "variables": { "files": [] } }' \ - # highlight-next-line - -F map='{ "firstFile": ["variables", "files", 5] }' \ - -F firstFile=@a.txt -``` - -```json title="Resultant Operations Object" -{ - "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", - // highlight-next-line - "variables": { "files": [null, null, null, null, null, ] } -} -``` - -## File Uploads on Batched Queries -File uploads work in conjunction with batched queries. When processing a multi-part request as a batch, prefix each of the mapped object-path references with an index of the batch you want the file to apply to. As you might guess this is usually handled by a supported client automatically. 
- -```bash title="Sample Query" -curl localhost:3000/graphql \ - -F operations='[ - { "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", "variables": { "files": [] } }, - { "query": "mutation ($files: [Upload]) { multiFileUpload(files: $files) }", "variables": { "files": [] } }, - ]' \ - # highlight-next-line - -F map='{ "firstFile": [0, "variables", "files", 0], "secondFile": [1, "variables", "files", 0] }' \ - -F firstFile=@a.txt - -F secondFile=@b.txt -``` - -## The FileUpload Scalar -The following properties on the `FileUpload` C# class can be useful: - -* `FileName` - The name of the file that was uploaded. This property will be null if a non-file form field is referenced. -* `MapKey` - The key value used to place this file within a variable collection. This is usually the form field name on the multi-part request. -* `ContentType` - The supplied `content-type` value sent with the file. This value will be null for non-file fields. -* `Headers` - A collection of all the headers provided with the uploaded file. This value will be null for non-file fields. - -## Opening a File Stream -When opening a file stream you need to await a call `FileUpload.OpenFileAsync()`. This method is an abstraction on top of an internal wrapper that standardizes file streams across all implementions (see below for implementing your own file processor). When working with the standard `IFormFile` interface provided by ASP.NET this call is a simple wrapper for `IFormFile.OpenReadStream()`. - -```csharp title=ExampleFile Upload Controller -using GraphQL.AspNet.ServerExtensions.MultipartRequests; - -public class FileUploadController : GraphController -{ - [MutationRoot("singleFileUpload")] - public async Task UploadFile(FileUpload fileRef) - { - // do something with the file stream - // it is your responsibility to close and dispose of it - // highlight-next-line - using var stream = await fileRef.OpenFileStreamAsync(); - - return 0; - } -} -``` - -## Custom File Handling -By default, this extension splits the POST request on an `HttpContext` and presents the different parts to the query engine in a manner it expects. This means that any uploaded files are consumed under the hood as ASP.NET's built in `IFormFile` interface. While this is fine for most users, it can be troublesome with regard to timeouts and large file requests. Also, there may be scenarios where you want to save off files prior to executing a query or perhaps you'll need to process the file stream multiple times. - -You can implement and register your own `IFileUploadScalarValueMaker` to add custom processing logic for each file or blob BEFORE graphql gets ahold of it. For instance, some users may want to write incoming files to local disk or cloud storage and present GraphQL with a stream that points to that local reference, rather than the file reference on the http request. - -```csharp - public interface IFileUploadScalarValueMaker -{ - // This overload is used when processing traditional files received as part of a - // multi-part form through ASP.NET's HttpContext - Task CreateFileScalar(IFormFile aspNetFile); - - // This overload is used when processing data received on a - // multi-part form field rather than as a formal file upload. 
- Task CreateFileScalar(string mapKey, byte[] blobData); -} -``` - - -```csharp title="Register Your Custom Value Maker" -// other startup code omitted - -// register your scalar value maker BEFORE calling .AddGraphQL -services.AddSingleton(); - -services.AddGraphQL(options => { - options.AddMultipartRequestSupport(); -}); -``` - -:::tip -You can inherit from `FileUpload` and extend it as needed on your custom maker. However, be sure to declare your method parameters as `FileUpload` in your controllers so that GraphQL knows what scalar you are requesting. -::: - -Take a look at the [default upload scalar value maker](https://google.com) for some helpful details when trying to implement your own. - -## Timeouts and File Uploads - -Be mindful of any query timeouts you have set for your schemas. ASP.NET may start processing your query before all the file contents are made available to the server as long as it has the initial POST request. This also means that your graphql queries may start executing before the file contents arrive. - -While this asysncronicty usually works to your advantage, allowing your queries to begin processing before all the files are uploaded to the server; you may find that your queries pause on `.OpenFileStreamAsync()` waiting for the file stream to become available if there is a network delay or a large file being uploaded. If you have a [custom timeout](../reference/schema-configuration.md#querytimeout) configured for a schema, it may trigger while waiting for the file. Be sure to set your timeouts to a long enough period of time to avoid this scenario. \ No newline at end of file From 82d56ab6a903c4e7e89dde7f275a7ef5b588620d Mon Sep 17 00:00:00 2001 From: Kevin Carroll Date: Sun, 7 May 2023 11:25:24 -0700 Subject: [PATCH 7/8] WIP, fixed link with scalar maker --- docs/server-extensions/multipart-requests.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/server-extensions/multipart-requests.md b/docs/server-extensions/multipart-requests.md index 8156fa6..17bbcd2 100644 --- a/docs/server-extensions/multipart-requests.md +++ b/docs/server-extensions/multipart-requests.md @@ -239,7 +239,7 @@ services.AddGraphQL(options => { You can inherit from `FileUpload` and extend it as needed on your custom maker. However, be sure to declare your method parameters as `FileUpload` in your controllers so that GraphQL knows what scalar you are requesting. ::: -Take a look at the [default upload scalar value maker](https://google.com) for some helpful details when trying to implement your own. +Take a look at the [default upload scalar value maker](https://github.com/graphql-aspnet/graphql-aspnet/blob/master/src/graphql-aspnet/ServerExtensions/MultipartRequests/Engine/TypeMakers/DefaultFileUploadScalarValueMaker.cs) for some helpful details when trying to implement your own. 
### Timeouts and File Uploads From 281dcc31279c1512e215e810279d2452e1e3a166 Mon Sep 17 00:00:00 2001 From: Kevin Carroll Date: Sun, 7 May 2023 11:45:15 -0700 Subject: [PATCH 8/8] WIP, updated code references added demo project --- docs/reference/demo-projects.md | 8 ++++++++ docs/server-extensions/multipart-requests.md | 9 +++++++-- 2 files changed, 15 insertions(+), 2 deletions(-) diff --git a/docs/reference/demo-projects.md b/docs/reference/demo-projects.md index 1701fdf..a565c69 100644 --- a/docs/reference/demo-projects.md +++ b/docs/reference/demo-projects.md @@ -37,3 +37,11 @@ Demonstrates the use of an external subscription event publisher and a consumer 📌 [Subscriptions w/ React & Apollo Client](https://github.com/graphql-aspnet/demo-projects/tree/master/Subscriptions-ReactApolloClient)
A sample react application that makes use of the [apollo client](https://www.apollographql.com/docs/react/) to connect to a GraphQL ASP.NET server. + +
+ +### Extensions + +📌 [File Uploads](https://github.com/graphql-aspnet/demo-projects/tree/master/File-Uploads)
+Demonstrates the use of the [graphql-multipart-form-spec](https://github.com/jaydenseric/graphql-multipart-request-spec) compliant extension to perform file uploads as part of a graphql query.
+
diff --git a/docs/server-extensions/multipart-requests.md b/docs/server-extensions/multipart-requests.md
index 17bbcd2..89ebba3 100644
--- a/docs/server-extensions/multipart-requests.md
+++ b/docs/server-extensions/multipart-requests.md
@@ -10,7 +10,9 @@ sidebar_position: 0
 ## Multipart Request Specification
 GraphQL ASP.NET provides built in support for batch query processing and file uploads via an implementation of the [GraphQL Multipart Request Specification](https://github.com/jaydenseric/graphql-multipart-request-spec).
 
-:::caution
+This extension requires a minimum version of `v1.2.0` of the main library and you must target .NET 6 or later. This extension will not work with the .NET standard implementation.
+
+:::info
 This document covers how to submit a batch query and upload files that conform to the above specification. It provides sample curl requests that would be accepted for the given sample code but does not explain in detail the various form fields required to complete a request. It is highly recommended to use a [supported client](https://github.com/jaydenseric/graphql-multipart-request-spec#client) when enabling this server extension.
 :::
@@ -506,4 +508,7 @@ mpOptions.RequireMultipartRequestHttpProcessor = true;
 | ------------- | ----------------- |
 | `true` | `true`, `false` |
 
-Determines if, when starting up the application, the extension will check that the required http processor is registered. When set to true, if the required processor is not registered and configuration exception will be thrown and the server will fail to start. This can be helpful when registering multiple extensions to ensure that a valid processor is registered such that multipart form requests will be handled correctly.
\ No newline at end of file
+Determines if, when starting up the application, the extension will check that the required http processor is registered. When set to true, if the required processor is not registered, a configuration exception will be thrown and the server will fail to start. This can be helpful when registering multiple extensions to ensure that a valid processor is registered such that multipart form requests will be handled correctly.
+
+## Demo Project
+See the [demo projects](../reference/demo-projects.md) for a sample project utilizing [jaydenseric's apollo-upload-client](https://github.com/jaydenseric/apollo-upload-client) as a front end for performing file uploads against this extension.
\ No newline at end of file