In order to serialize message bodies when generating examples for request/response bodies from a data structure, we can provide numerous adapters for serializing content: for example, serializing a data structure element into JSON, a multipart form, JSON Schema, or MSON.
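As a rough sketch of the idea, adapters could be registered against the media types they produce and looked up when a body is needed. None of these names (`registerAdapter`, `serializeBody`) exist yet; this is only an assumed shape:

```javascript
// Hypothetical registry of serialisation adapters, keyed by media type.
const adapters = new Map();

function registerAdapter(adapter) {
  for (const mediaType of adapter.mediaTypes) {
    adapters.set(mediaType, adapter);
  }
}

// A minimal JSON adapter: turns a plain value (standing in for a data
// structure element) into a JSON message body.
registerAdapter({
  name: 'json',
  mediaTypes: ['application/json'],
  serialize: async (input) => JSON.stringify(input),
});

// Look up the adapter for the requested media type and delegate to it.
async function serializeBody(input, mediaType) {
  const adapter = adapters.get(mediaType);
  if (!adapter) {
    throw new Error(`no serializer registered for ${mediaType}`);
  }
  return adapter.serialize(input);
}
```

A multipart form or MSON adapter would slot in the same way, registering its own media types without the registry needing to know about it.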
Components:
Serialize should return a parse result, so that adapters can emit warnings and errors. This is a problem in the current design: serialising API Blueprint or OpenAPI, for example, can lose information without that being clear to the end user during conversion. Loss of information should produce warnings, with source map information attached where possible.
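A minimal sketch of what returning a parse-result-like value could look like. The shape of the result and the annotation, and the specific lossy condition shown, are assumptions for illustration:

```javascript
// Hypothetical serialiser that returns { content, annotations } instead of a
// bare string, so lossy conversions surface as warnings.
function serializeWithAnnotations(element) {
  const annotations = [];

  // Assumed example of loss: the target format has nowhere to put the
  // element's description, so we warn rather than dropping it silently.
  if (element.description) {
    annotations.push({
      type: 'warning',
      message: `description of '${element.name}' cannot be represented in the target format`,
      sourceMap: element.sourceMap, // may be undefined when unavailable
    });
  }

  return {
    content: JSON.stringify({ [element.name]: element.value }),
    annotations,
  };
}
```

The consumer can then render the body and the warnings side by side, e.g. "This line from my OAS 2 document cannot be serialized into API Blueprint because X".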
"This line from my OAS 2 document cannot be serialized into API Blueprint because X".
Other parsers such as OAS 2 and OAS 3 should make use of the registered serializers when generating message bodies (when the generateMessageBody option is enabled). The parser consumer may load the JSON adapter if they want to generate JSON bodies, or custom adapters for YAML, MessagePack, Protobuf, etc.
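The `generateMessageBody` option name comes from the text above; everything else in this sketch (the parser helper, the serializer lookup) is assumed for demonstration:

```javascript
// Hypothetical serializers the consumer has chosen to load.
const serializers = {
  'application/json': (value) => JSON.stringify(value),
};

// A parser would only fill in example bodies when the option is enabled and
// a serializer for the request's content type has been loaded.
function generateExampleRequest(dataStructure, options = {}) {
  const request = { contentType: 'application/json', body: undefined };

  if (options.generateMessageBody) {
    const serialize = serializers[request.contentType];
    if (serialize) {
      request.body = serialize(dataStructure);
    }
  }

  return request;
}
```

A consumer that never loads a YAML or Protobuf adapter simply gets no generated bodies for those content types, rather than an error.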
Extra note: "Console" functionality in documentation can provide a UI for entering data structures, headers, and parameters, updating values in a copy of the data structure elements. It can use these adapters to generate the message body, so the effort does not have to be duplicated for each content type.
Here's a rough idea of how this might look and how you can use these adapters:
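The interface code block appears to have been lost here. A minimal sketch of what it might look like, assuming an adapter object whose `serialize` receives a frozen element (all names are assumptions):

```javascript
// Hypothetical adapter interface: a named adapter declaring its media types,
// with an async serialize that receives the (frozen) input element.
const jsonSerializer = {
  name: 'json',
  mediaTypes: ['application/json'],

  // Resolves to the serialised message body for the given element.
  async serialize({ input, mediaType }) {
    return JSON.stringify(input.toValue());
  },
};
```

Because the element is frozen, `serialize` can also walk up `input.parent` to resolve references, which is what motivates the caching discussed below.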
The above interface expects the input element to be frozen so it can traverse up to its parents to resolve any references. We likely want to be able to provide a cache so that effort is not duplicated when resolving numerous structures that share references within the same document. Possibly a cache could be registered with the namespace, perhaps something like:
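One possible shape for such a namespace-level cache, keyed per serializer media type and element ID; `storeResult` and `cachedResult` are assumed names, not an existing API:

```javascript
// Hypothetical namespace carrying a serialisation cache.
const namespace = {
  cache: new Map(),
};

// One entry per (media type, element id) pair, so structures shared by
// reference are only serialised once per target format.
function cacheKey(mediaType, elementId) {
  return `${mediaType}:${elementId}`;
}

namespace.storeResult = (mediaType, elementId, body) => {
  namespace.cache.set(cacheKey(mediaType, elementId), body);
};

namespace.cachedResult = (mediaType, elementId) =>
  namespace.cache.get(cacheKey(mediaType, elementId));
```

Keying on the media type matters because the same element may be serialised into several formats within one document.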
Adapters will be able to detect the cache and utilise it to prevent redundant expensive work when serialising recursive structures:
```js
// namespace's serialize function
async function serialize({ input, mediaType }) {
  // pseudocode: exit early when a cached result exists for this element
  if (id(for: input) && cache && cache has result for id(for: input) + mediaType) {
    return cached result;
  }

  // dispatch to an adapter to handle serialize
  return await find('serialize', mediaType)(input);
}

async function serialize({ input, namespace, mediaType }) {
  // exit early if found in cache
  const result = namespace.cache(forSerializer: 'application/json', input.id);
  if (result) {
    return result;
  }

  // pseudo/imperative code for demo purposes: recursively walk the input,
  // assuming input is an object element
  const object = {};
  for (const [key, value] of input) {
    // recurse into serialize; it will handle caching of the value if it has
    // an ID, and resolve references
    const v = await namespace.serialize({ input: value, mediaType });
    // note: in this example v is already stringified JSON, so we should
    // design a way around that, else we'll double encode
    object[key.toValue()] = v;
  }

  return JSON.stringify(object);
}
```