feat: add python type generator #1011
base: master
Conversation
A new class is needed because `TypedDict` uses `NotRequired` for missing attributes.
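For context, a minimal sketch of the point being made (the `movies` table and its column names are hypothetical, not taken from this PR): a column with a default or a nullable column can be omitted from an insert payload entirely, which is different from passing `None`, and `TypedDict` expresses that with `NotRequired`.

```python
# Minimal sketch, hypothetical table/column names.
from typing import Optional
from typing_extensions import NotRequired, TypedDict  # in stdlib `typing` since Python 3.11


class MoviesInsert(TypedDict):
    name: str  # NOT NULL, no default: must always be provided
    id: NotRequired[int]  # identity/default column: may be omitted
    description: NotRequired[Optional[str]]  # nullable: may be omitted or set to None


payload: MoviesInsert = {"name": "Dune"}  # valid: NotRequired keys can be left out
```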
Pull Request Test Coverage Report for Build 19441158062

💛 - Coveralls
LGTM. In terms of release, we could make a canary release from this PR and set up some client-side tests on the supabase-py repo (even if basic, just to ensure no regressions and working types end-to-end).
(A bit like it's done for the TS types here: https://github.com/supabase/supabase-js/blob/master/packages/core/postgrest-js/test/resource-embedding.test.ts#L133-L152)
The idea would be to use the canary image to generate types for a test database, and have some basic runtime + type tests to see if it works as expected before this gets merged to production. WDYT?
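For illustration, such a client-side smoke test could look roughly like the sketch below. Everything here is an assumption for the sake of the example: the `generated_types` module, the `PublicMovies` model, the `movies` table, and the environment variable names are hypothetical, with the canary-generated types assumed to be produced ahead of time in CI.

```python
# Hypothetical smoke test on the supabase-py side: run select() against a test
# database and check that the rows validate against the canary-generated model.
import os

import pytest
from supabase import create_client

from generated_types import PublicMovies  # hypothetical module emitted by the canary generator


@pytest.fixture
def client():
    return create_client(os.environ["SUPABASE_TEST_URL"], os.environ["SUPABASE_TEST_KEY"])


def test_select_rows_validate_against_generated_model(client):
    response = client.table("movies").select("*").limit(5).execute()
    # Runtime validation against the generated Pydantic model; a type checker
    # (mypy/pyright) run in CI would cover the static side.
    for row in response.data:
        PublicMovies(**row)
```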
src/server/templates/python.ts (Outdated)
```ts
const py_tables = tables
  .filter((table) => schemas.some((schema) => schema.name === table.schema))
  .flatMap((table) => {
    const py_class_and_methods = ctx.tableToClass(table)
    return py_class_and_methods
  })
const composite_types = types
  .filter((type) => type.attributes.length > 0)
  .map((type) => ctx.typeToClass(type))
const py_views = views.map((view) => ctx.viewToClass(view))
const py_matviews = materializedViews.map((matview) => ctx.matViewToClass(matview))
```
nitpick: Tables on schemas that are excluded should also be excluded by introspection (not sure the filter is needed there).
If it isn't, then I guess you'd also want to apply the same filter to views and matviews? (If they're declared on a schema not in the list, don't generate types for them.)
@avallete can you elaborate more on this? The TypeScript version of this code seems to output every schema. I think this should be fixed in the TypeScript version first, as PostgREST doesn't give access to non-exposed schemas like auth, storage, extensions, and net by default, and these should never be accessed through PostgREST, which is what these types are generated for.
You are correct.
> @avallete can you elaborate more on this because the typescript version of this code seems to output every schema.

Sorry, I wasn't clear.
It does, and the included/excluded schemas will be filtered at the introspection level (`getGeneratorMetadata`).
My question was about why there is this check:

```ts
const py_tables = tables
  .filter((table) => schemas.some((schema) => schema.name === table.schema))
```

Maybe I've misunderstood, but it seems like this is supposed to ensure that `table.schema` matches one of the values in the included schemas. That should already have been taken care of at the introspection level (if a table is not within an included schema, it won't be introspected / present in the list).
After a small modification to the …
@o-santi please remember to add the necessary changes in the `package.json`
What is the current behavior?
Adds Python type generation support, piggybacking off of @ryanpeach's #808 pull request. The main difference from his approach is that I added a lot more structure by first creating the "AST" and then serializing it through the `serialize()` method, instead of constructing strings in place everywhere.

Fixes #795.
Additional context
The idea is that the normal `Public{table}` class should be used to parse the result of select queries, while the `Insert` and `Update` objects should be passed directly into the `.insert` and `.update` `postgrest` methods. As such, they needed to be `TypedDict`s instead of `BaseModel`s, since that's what the `postgrest` library expects in those functions.

This may change in the future, especially in a v3. Ideally, I'd love to be able to infer the type of the arguments for the function from the query builder itself, like TS already does, but I'm afraid Python's type checker is not strong enough to do that. I think a Frankenstein could be stitched together from `Literal`s and `@overload`s, but I'm not sure how scalable and user-friendly that would be. I will certainly research it more before doing a v3 rewrite. For now, this approach suffices.
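To make the intended usage concrete, here is a small sketch. The `movies` table, the exact generated names (`PublicMovies`, `PublicMoviesInsert`, `PublicMoviesUpdate`), and the `generated_types` module are assumptions for illustration, following the `Public{table}` naming described above.

```python
# Sketch of the usage described above: the BaseModel parses select results,
# while the TypedDict payloads go straight into .insert()/.update().
from generated_types import (  # hypothetical module/names
    PublicMovies,
    PublicMoviesInsert,
    PublicMoviesUpdate,
)


def add_and_rename_movie(client) -> list[PublicMovies]:
    new_movie: PublicMoviesInsert = {"name": "Dune"}  # NotRequired columns omitted
    client.table("movies").insert(new_movie).execute()

    patch: PublicMoviesUpdate = {"name": "Dune (2021)"}
    client.table("movies").update(patch).eq("name", "Dune").execute()

    rows = client.table("movies").select("*").execute().data
    # Parse the untyped dict rows into the generated Pydantic model.
    return [PublicMovies(**row) for row in rows]
```

This keeps the runtime contract identical to what `postgrest` already expects for `.insert()`/`.update()` payloads, while still giving select results a validated model.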