Asana / python-asana

Official Python client library for the Asana API v1

Generate models

mark-thm opened this issue · comments

It would be much more convenient as a consumer of the API to have generated model classes to make it clear which fields to populate in requests and which fields are available in responses.

It looks like model generation is explicitly suppressed here:

```java
// Remove model template files so that the generator does not generate models
modelTemplateFiles.remove("model.mustache");
modelTestTemplateFiles.remove("model_test.mustache");
modelDocTemplateFiles.remove("model_doc.mustache");
```

Would you reconsider?

Hi @mark-thm, we have a reason for not generating the models. We experimented with generating models in python-asana v4.X.X (EX: v4.0.11) and ran into many issues with models and our API. Here are some of the issues we discovered while trying to make this work.

1: Created confusion with developers
We use a lot of references in our OpenAPI Spec, which resulted in many generated models that confused people (e.g., AllOf...). See Documentation For Models in v4.0.11.

2: Generated response models could not handle our API's GraphQL-like behavior
By default, endpoints that return an array of resources return a compact version of each resource. EX: Get multiple tasks returns an array of TaskCompact items. The Asana API has a feature called opt_fields that allows users to request extra properties in the response for that endpoint.

Scenario:

  1. User makes a call to the Get multiple tasks endpoint and asks for the likes and html_notes properties in the opt_fields query parameter. Note that these properties are not returned by default and do not exist in the TaskCompact schema
  2. Python client library makes the user's request
  3. Python client library receives a response with these extra properties
  4. Python client library tries to map this response to the TaskCompact model
  5. Python client library returns an error because these properties don't exist in the TaskCompact model
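The failure mode in the steps above can be sketched in a few lines. This is a hypothetical illustration, not actual python-asana or generator code: `TaskCompact` and `to_model` stand in for a generated model and its strict mapper.

```python
# Hypothetical sketch of a strictly-generated compact model and mapper.
from dataclasses import dataclass

@dataclass
class TaskCompact:
    # Only the fields declared in the compact schema.
    gid: str
    name: str

def to_model(data: dict) -> TaskCompact:
    # A strict mapper, as a code generator would emit it:
    # any key not declared in the schema is an error.
    allowed = {"gid", "name"}
    unknown = set(data) - allowed
    if unknown:
        raise ValueError(f"unknown properties: {sorted(unknown)}")
    return TaskCompact(**data)

# The default response maps cleanly:
task = to_model({"gid": "1", "name": "Buy milk"})

# A response requested with opt_fields=likes,html_notes carries extra
# keys, so the strict mapper rejects it:
try:
    to_model({"gid": "1", "name": "Buy milk", "likes": [], "html_notes": "<body></body>"})
except ValueError as exc:
    print(exc)
```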

We tried to get around this issue by modifying our OpenAPI Spec so that endpoints that return multiple values return an array of the full schema resource instead. This solved the problem of the client library not knowing which properties to map. The new issue is that users can access any property of that model, and properties whose values are not returned by the API default to None. Imagine a developer who relies on a property from the schema but forgot to request it in the opt_fields query parameter: the result would always be None, even though the API might return a real value if it were requested.
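The silent-None trap can be sketched as follows. Again this is hypothetical illustration code, not the library's actual models: a full-schema `Task` whose optional fields default to None.

```python
# Hypothetical sketch of a full-schema model whose unrequested fields
# default to None, masking a forgotten opt_fields entry.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Task:
    gid: str
    name: str
    # Only populated when requested via opt_fields; None otherwise.
    html_notes: Optional[str] = None

def to_model(data: dict) -> Task:
    return Task(**data)

# The developer forgot to add html_notes to opt_fields, so the API
# response omits it. The model silently reports None, and the caller
# cannot distinguish "no notes" from "notes were never requested".
task = to_model({"gid": "1", "name": "Buy milk"})
print(task.html_notes)  # None
```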

3: Asking for nested opt_fields
Using our opt_fields query param, you can make a request that spans multiple resources. Say the response model is TaskResponse, which contains an assignee property represented by a UserCompact model. If the developer asks for a property that is not within the UserCompact model of the TaskResponse model (EX: assignee.workspaces), then the client library throws an error.

We could fix this by modifying our OAS to convert all the nested compact models into full schema models, but then we run into the same issue as 2. Additionally, because opt_fields can keep nesting across resources, there is potentially no end to the chain of properties a request can ask for.
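The nested case is the same strictness problem one level down. A hypothetical sketch (these model and mapper names are illustrative, not python-asana's):

```python
# Hypothetical sketch: nested opt_fields crosses model boundaries.
from dataclasses import dataclass

@dataclass
class UserCompact:
    # Only the fields declared in the compact schema.
    gid: str
    name: str

def to_user(data: dict) -> UserCompact:
    # Strict nested mapper, as a generator would emit it.
    allowed = {"gid", "name"}
    unknown = set(data) - allowed
    if unknown:
        raise ValueError(f"unknown properties on UserCompact: {sorted(unknown)}")
    return UserCompact(**data)

# A request with opt_fields=assignee.workspaces returns workspaces
# nested under assignee, but UserCompact declares no such field, so
# mapping the nested object fails:
try:
    to_user({"gid": "7", "name": "Ada", "workspaces": [{"gid": "99"}]})
except ValueError as exc:
    print(exc)
```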

4: Representing dynamic keys in query params
Our Search tasks in a workspace endpoint has dynamic query param key names (Custom field parameters) that we could not define in our OpenAPI Spec, as it is not normal API behavior. Even if there were a convention for defining this in our OpenAPI Spec, we doubt the generator would understand this kind of logic. The result is a client library that does not know what the dynamic keys are or how to handle them.
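The dynamic-key shape can be sketched like this. The helper below is hypothetical; the point is that the parameter name itself contains runtime data (a custom field gid), which a plain dict expresses naturally but a statically generated parameter model cannot.

```python
# Hypothetical sketch: search query params whose key names embed a
# runtime custom-field gid, e.g. "custom_fields.<gid>.value".
def build_search_params(base_params: dict, custom_field_filters: dict) -> dict:
    params = dict(base_params)
    for field_gid, value in custom_field_filters.items():
        # The key name depends on data only known at runtime, so it
        # cannot be enumerated ahead of time in a typed model.
        params[f"custom_fields.{field_gid}.value"] = value
    return params

params = build_search_params(
    {"text": "launch", "completed": "false"},
    {"12345": "High"},
)
print(params)
# {'text': 'launch', 'completed': 'false', 'custom_fields.12345.value': 'High'}
```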

5: Handling dynamic responses
Our Get multiple memberships endpoint returns an array of resources whose type depends on the search criteria, and the client library has to decide at runtime which model to map each item of the response to.
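That runtime dispatch might look like the following hypothetical sketch, which switches on each item's resource_type field (the model names are illustrative, not the library's):

```python
# Hypothetical sketch: mapping a mixed-type response by dispatching on
# each item's "resource_type", something a single static response model
# cannot express.
from dataclasses import dataclass

@dataclass
class TeamMembership:
    gid: str

@dataclass
class ProjectMembership:
    gid: str

MODEL_BY_TYPE = {
    "team_membership": TeamMembership,
    "project_membership": ProjectMembership,
}

def map_memberships(items: list) -> list:
    # Pick the model class for each item at runtime.
    return [MODEL_BY_TYPE[item["resource_type"]](gid=item["gid"]) for item in items]

result = map_memberships([
    {"gid": "1", "resource_type": "team_membership"},
    {"gid": "2", "resource_type": "project_membership"},
])
```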

Ok, thanks for such a thorough response. We took the update to 4.0.x before it got yanked and are a little stuck, because the generated models worked for our admittedly very limited use, and their absence means we have to fight mypy a fair bit to make sense of things.