Description
Hi there,
Inspired by swagger-api/swagger-codegen#4883.
While trying to introduce Swagger to our project, we ran into a certain issue.
When processing inline models, the resolver applies certain optimizations that cause some of the models further down the line to be reused, skipping the generation of unique model classes. It would help us a lot to be able to toggle this match skipping while resolving inline models.
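For reference, here is a minimal sketch of how I imagine consuming such a toggle, assuming the `OpenAPIV3Parser`/`ParseOptions` API; the commented-out `setSkipMatches(...)` call is the proposed, hypothetical knob (and `legacy-api.yaml` is a placeholder), not an existing option:

```java
import io.swagger.v3.oas.models.OpenAPI;
import io.swagger.v3.parser.OpenAPIV3Parser;
import io.swagger.v3.parser.core.models.ParseOptions;

public class ParseWithoutMatchSkipping {
    public static void main(String[] args) {
        ParseOptions options = new ParseOptions();
        options.setResolve(true);  // resolve $ref references
        options.setFlatten(true);  // run the inline model resolver

        // Proposed (hypothetical) toggle: keep every inline model as its own named
        // schema instead of reusing an earlier, structurally identical match.
        // options.setSkipMatches(false);

        OpenAPI openAPI = new OpenAPIV3Parser()
                .readLocation("legacy-api.yaml", null, options)
                .getOpenAPI();

        // With match skipping disabled, both operationAlphaInAsset and
        // operationBetaInAsset would show up here as separate schemas.
        System.out.println(openAPI.getComponents().getSchemas().keySet());
    }
}
```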
It seems to me that inline models are treated as second-class citizens by the OpenAPI specification (and they probably should be). I agree that `$ref` and model extraction contribute greatly towards reuse, but extracting models is only feasible if:
- you design the API from scratch;
- or the API is relatively small;
- and you own the OpenAPI document.
We depend on a legacy system that exposes a large number of operations. Its developers were kind enough to generate an OpenAPI document for us so that we could integrate via REST easily. Yes, it would be nice to extract and reuse all the necessary models, but it is not realistic to expect someone to analyze these APIs, which have accumulated over the years, right now. In this case we are better off with neat, isolated operations, even if that causes some duplication and overhead.
Reasoning:
1. I find the default behavior quite confusing with regard to all the optimizations and flattening. I prefer my operations isolated and independent, even if this leads to a greater number of generated classes. Tastes differ, of course, but I'm pretty sure a substantial number of developers would want to go the same way. Personally, I would make generating unique classes the default, but that would surely break a lot of workflows (@wing328 also mentioned in swagger-codegen#4883 ("[Java] doesn't generate correct classes for models with same nested properties") that there was no consensus on the "correct" behavior). However, I think there should at least be the ability to tune the resolver to the user's needs, especially considering that the setter for this flag has already been exposed.
2. Consider the following OpenAPI document:
```yaml
openapi: "3.0.0"
paths:
  /operationAlpha:
    get:
      requestBody:
        content:
          application/json:
            schema:
              type: "object"
              title: "operationAlphaRequest"
              properties:
                in:
                  type: "object"
                  title: "operationAlphaIn"
                  properties:
                    asset:
                      type: "object"
                      title: "operationAlphaInAsset"
                      properties:
                        id1:
                          type: "integer"
                          format: "int32"
                        id2:
                          type: "integer"
                          format: "int32"
  /operationBeta:
    get:
      requestBody:
        content:
          application/json:
            schema:
              type: "object"
              title: "operationBetaRequest"
              properties:
                in:
                  type: "object"
                  title: "operationBetaIn"
                  properties:
                    asset:
                      type: "object"
                      title: "operationBetaInAsset"
                      properties:
                        id1:
                          type: "integer"
                          format: "int32"
                        id2:
                          type: "integer"
                          format: "int32"
```
The current approach causes just four model classes to be generated (`operationAlphaRequest`, `operationAlphaIn`, `operationAlphaInAsset`, `operationBetaRequest`). Thus the developer is forced to code against `operationAlphaIn` and `operationAlphaInAsset` in `operationBeta`.
Now imagine someone decides to expand `operationAlphaInAsset` with an additional field. In my understanding this shouldn't be a breaking change. However, since the definitions are no longer identical, the resolver now generates all six model classes, which leads to a compilation error: `operationBeta` now depends on `operationBetaIn` and `operationBetaInAsset`, not on `operationAlphaIn` and `operationAlphaInAsset` as before (see the sketch after this list). This alone seems like a good reason to at least make this setting toggleable.
3. Readability. It is quite confusing to code against `operationBeta` in `operationAlpha` terms. I've just introduced Swagger to our project, and while the generated classes are going to make our lives a lot easier, I'm going to have a hard time explaining to my coworkers why the class names are so confusing.
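To make the breaking-change scenario from point 2 concrete, here is a rough sketch of what the consuming code might look like; the class and accessor names are only illustrative of typical generated output, not taken from an actual codegen run:

```java
// Hypothetical usage of the generated model classes; the names below are
// illustrative of typical codegen output, not an actual generated API.
public class BetaRequestExample {
    public static void main(String[] args) {
        OperationAlphaInAsset asset = new OperationAlphaInAsset();
        asset.setId1(1);
        asset.setId2(2);

        OperationAlphaIn in = new OperationAlphaIn();
        in.setAsset(asset);

        // Because the resolver de-duplicated the structurally identical inline
        // schemas, the request model for /operationBeta reuses the *Alpha* classes:
        OperationBetaRequest betaRequest = new OperationBetaRequest();
        betaRequest.setIn(in); // compiles today

        // As soon as operationAlphaInAsset gains an extra field, the schemas stop
        // matching, the resolver emits OperationBetaIn / OperationBetaInAsset, and
        // setIn(...) above no longer accepts an OperationAlphaIn.
    }
}
```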
I have also prepared changes for the `swagger-codegen-maven-plugin` to expose this setting, but I'd rather hold them back until I get a green light here (as they depend entirely on `swagger-parser`).
Thanks.