prisma / prisma

Next-generation ORM for Node.js & TypeScript | PostgreSQL, MySQL, MariaDB, SQL Server, SQLite, MongoDB and CockroachDB

Home Page: https://www.prisma.io

Define type of content of `Json` field

MaximNd opened this issue · comments

commented

Problem

Right now, if you have the following schema with a Json field:

model User {
  id               Int  @default(autoincrement()) @id
  name             String?
  extendedProfile  Json
}

You'll end up without a strict type for the extendedProfile field in your TypeScript code.

const user = prismaService.user.findOne(...);
user.extendedProfile // we don't know the type of extendedProfile

One way to fix this is to define an interface in your code and use it like this:

interface UserProfile {
    field1: string;
    field2: number;
}

const user = prismaService.user.findOne(...);
(user.extendedProfile as UserProfile).field1; // now we have autocompletion

But casting like that at every call site is not really comfortable.

We could also create a class and instantiate it like this:

interface UserProfile {
    field1: string;
    field2: number;
}

class User {
    id: string;
    name?: string;
    extendedProfile: UserProfile;

    constructor(user: PrismaUser /* user object returned by prisma */) {
        // ... initialize
    }
}

const user = new User(prismaService.user.findOne(...));

But this solution creates some overhead due to the creation of an additional object.

Suggested solution

Maybe we could specify the type in the schema.prisma file like this?

json ExtendedUserProfileJson {
    field1  String
    field2  Int
}

model User {
  id               Int  @default(autoincrement()) @id
  name             String?
  extendedProfile  ExtendedUserProfileJson
}

Alternatives

Alternatively, we could somehow manage this in TypeScript.

commented

Have you looked into this?

Advanced usage of generated types

Yes, but I don't understand how it can help with this problem. I may be wrong, but you cannot define the shape of the Json field using this tutorial.

Interesting idea, though all fields should be nullable since there is no guarantee for JSON format enforced by the database.

commented

Interesting idea, though all fields should be nullable since there is no guarantee for JSON format enforced by the database.

Probably, if the database you are using does not have JSON support, then you simply cannot use the Json field type and this feature.

Probably, if the database you are using does not have JSON support, then you simply cannot use the Json field type and this feature.

I think @Sytten is talking about the fact that the Json type, say in Postgres, doesn't enforce any schema. It will not guarantee that all stored data follows the shape you define. The proposal you have defined here would be enforced entirely at the application level.

Also, @MaximNd why not define a 1-1 relation here if you want a schema to be maintained? Just interested to know. The main selling point of the Json type for me personally is that it allows me to throw data in it without worrying about its shape.

commented

Probably, if the database you are using does not have JSON support, then you simply cannot use the Json field type and this feature.

I think @Sytten is talking about the fact that the Json type, say in Postgres, doesn't enforce any schema. It will not guarantee that all stored data follows the shape you define. The proposal you have defined here would be enforced entirely at the application level.

Also, @MaximNd why not define a 1-1 relation here if you want a schema to be maintained? Just interested to know. The main selling point of the Json type for me personally is that it allows me to throw data in it without worrying about its shape.

Yes, you are right. In this example I could define a 1-1 relationship, or, if I store an array in a Json field, a 1-m relationship. But sometimes, when you have a simple data structure (for example, only a few fields) and you know that the JSON data will relate only to this entry, it is easier to define a Json field. The advantage is that you don't need to run an additional query or use JOINs to get the related data. There is also the occasional case of users migrating their database from a NoSQL store like Mongo to a relational database, leaving a lot of unstructured data.

Being able to type your Json fields is a simple and understandable feature request. Although there might be workarounds, this might very well be something that Prisma could offer in the future on its own - so having this feature request is valid.

commented

I have tried the following workaround. It works fine until I need a field of a type other than number or string, e.g. Date. Without the date field, the approach below works. I tried to use transformers for the string-to-Date conversion, but that contradicts the Prisma.InputJsonObject definition.

import { Prisma } from "@prisma/client";
import { Type } from "class-transformer/decorators";
import { IsOptional, Length } from "class-validator";

export class Qualification implements Prisma.InputJsonObject {
  @Length(1, 30)
  name?: string;

  @IsOptional()
  age?: number;

  @IsOptional()
  @Type(() => Date)
  birthday?: Date;

  [index: string]: Prisma.JsonValue;
}

Any suggestions?

@husayt What is this approach? I googled "InputJsonObject" and Prisma and only came up with this post and prisma/docs#669, which merely has "InputJsonObject" in a list.

A note on potential interaction with #2431 and #2444: As JSON may be used for static or dynamic content, this should be opt in - though ideally, you could still select and filter on specific keys even if you have not typed the JSON field. In the specific use case I'm thinking of, I would actually maintain a separate table that would define which rows contain which keys. Specifically, I have a puzzle with multiple solutions. A solution has a JSON field defining some details which depend on the configuration of the puzzle it is for.

I just ran into a need for this similar to OP's - I am using JSON as a way to store reliably structured data that is small, has several child fields, and is always 1:1 linked with the parent. Because of that, it feels wasteful to break it out into another table and require JOINs on every query.

I was expecting something to be available that was akin to the custom scalar operator in the GraphQL SDL - where you can define a name for a field type (and, in Prisma's case, define what type the column should be given in the schema) and then you are responsible for defining the shape yourself in the client code. You could imagine it working something like:

generator client {
  provider = "prisma-client-js"
  scalarDefinitions {
    RuleContents = "../../../prisma-scalars/RuleContents.ts"
  }
}

scalar RuleContents @underlyingType(Json)

model Rule {
  id  String  @id @default(uuid())
  rejectRule  RuleContents?
  favoriteRule  RuleContents?
}

Then the referenced TypeScript file would export an interface with the same name as your scalar. In the simplest form, the client would just be applying a cast for data read from the DB and type checking on data written to or queried from it. As an optional future enhancement, the scalar definition file might even be able to export custom serialize and deserialize functions that the Prisma client would use to transform data before writing to / after reading from the database.

Finally, if you were generating a client where the scalar type wasn't provided, the client could fall back to the @underlyingType and you'd get the same behavior we have now - e.g. if your underlying type was Json it would fall back to a basic any in TypeScript, and if your underlying type was Unsupported(Polygon) it would fall back to hiding the field, etc.
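Purely as an illustration of that optional enhancement (RuleContents, its shape, and both function names are invented here, not a real Prisma API), a scalar definition file might look something like this:

```typescript
// Hypothetical scalar definition file for the proposal above.
// `RuleContents` and the function names are invented for illustration.
interface RuleContents {
  kind: "allow" | "deny";
  patterns: string[];
}

// What the client might call before writing the column...
function serializeRuleContents(value: RuleContents): string {
  return JSON.stringify(value);
}

// ...and after reading it, failing fast on malformed rows.
function deserializeRuleContents(raw: string): RuleContents {
  const value = JSON.parse(raw) as RuleContents;
  if ((value.kind !== "allow" && value.kind !== "deny") || !Array.isArray(value.patterns)) {
    throw new Error("malformed RuleContents");
  }
  return value;
}

const roundTripped = deserializeRuleContents(
  serializeRuleContents({ kind: "allow", patterns: ["*.example.com"] })
);
console.log(roundTripped.kind); // "allow"
```

The round trip shows the appeal: validation lives next to the type, and the client never hands the application a value that hasn't passed through deserialize.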

Adding some light to this, it's a common problem with GraphQL. If an input sets the shape of an object, this will fail type validation when interacting with Prisma. Here is an example where ApplicationFieldConfig[] is coming from my GraphQL library.

This would be a huge lifesaver! Right now I have to cast each property to unknown and then to my own type; built-in typing would also remove the need for our own handling of JSON types.

I don't want to start a bad trend here, but +1 from me too! I'd like to be able to customize/extend Json column types in TypeScript.

Thanks for all your work here 🔺!

Solved the issue with type checking in a Nest.js application by using class-transformer's classToPlain + plainToClass.

Yes, really - mainly because MongoDB is now supported we need deep type safety, and it would also be applicable to the Postgres JSON data type.

@glebbash's solution works as of now. But in the end you will also want a single source of truth: your schema.prisma file.

I'm going to drop a suggestion here. A JSON type definition written in Prisma syntax would force the team to maintain parity with TypeScript in the schema file. A better solution might be to reference the type alone:

model Post {
  id Int @id @default(autoincrement())
  data Json @typescript(PostData)
}

The only difficulty here is I'm not sure how prisma would know where to find the PostData definition.

Another possible solution would be to provide types at the client level:

new PrismaClient<{ Post: { data: PostData } }>()

@fny

new PrismaClient<{ Post: { data: PostData } }>()

Would be a good solution here.

@fny @mmahalwy
Great ideas :) I love how this could be a simple fix that gives an almost instant solution.
Even though I think it would make a great solution, isn't the whole point to avoid creating TS types by hand and keeping them synced with the Prisma schema?

Yeah, while that would be a pretty easy fix (indeed, one could probably write that as an npm package that wrapped Prisma), it would not actually enforce the type to the same level that Prisma enforces other types from the schema, nor would it necessarily enforce the same type across projects that shared a single schema file for cross-compatibility.

We already accept that Prisma generates TypeScript typings for us as part of client generation, so I would rather the typings for these Json fields were kept in the Prisma schema and generated into TS along with the rest of the models.
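For what it's worth, the type-level half of the wrapper-package idea mentioned above is straightforward; the sketch below (all names hypothetical, nothing imported from @prisma/client) shows that the hard part is not the typing but the total absence of runtime validation:

```typescript
// All names below are hypothetical; nothing is imported from @prisma/client.
type JsonValue =
  | string
  | number
  | boolean
  | null
  | { [key: string]: JsonValue }
  | JsonValue[];

// What a generated model with an untyped Json column looks like today.
interface PostRow {
  id: number;
  data: JsonValue;
}

// The shape we want the Json column to have.
interface PostData {
  title: string;
  tags: string[];
}

// Replace the overridden keys with their typed counterparts.
type WithTypedJson<Row, Overrides> = Omit<Row, keyof Overrides & keyof Row> & Overrides;

type TypedPost = WithTypedJson<PostRow, { data: PostData }>;

// At runtime this is only a cast: nothing checks that the data actually
// matches PostData, which is exactly the weakness discussed above.
function typePost(row: PostRow): TypedPost {
  return row as unknown as TypedPost;
}

const post = typePost({ id: 1, data: { title: "hello", tags: ["a"] } });
console.log(post.data.title); // typed access; no runtime guarantee
```

This is why schema-level support would be stronger: the generated client could own both the type and its enforcement, instead of a wrapper trusting the database blindly.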

We could take an approach similar to a GraphQL schema:

type Hotel {
   title: String
   rooms: Room[]
}

type Room {
   name: String
   size: String
}

model Something {
   registration: Int
   hotelData: Json<Hotel>
}
commented

Is there any progress on this ticket? More and more applications are using json columns in their databases and so it would be a great feature to support this, especially because Prisma’s main advantage is type safety.

Up — we really need this feature 🚀

+1

Is there a place where we can check the priority of this feature? I mean I ❤️ Prisma (I really do!), but it seems quite an obvious feature, no?

+1 on this
this is really needed tbh

Is there a place where we can check the priority of this feature? I mean I ❤️ Prisma (I really do!), but it seems quite an obvious feature, no?

https://pris.ly/roadmap @binajmen

Is there a place where we can check the priority of this feature? I mean I ❤️ Prisma (I really do!), but it seems quite an obvious feature, no?

https://pris.ly/roadmap @binajmen

Thank you @pantharshit00. Typing "json" in the search field does not highlight any backlog entry related to this issue; only one archived entry, https://www.notion.so/JSON-field-improvements-5acca22f9a474ab4a8f67e19d412cc25, seems JSON-related, and it does not cover the feature discussed in this thread.

Is this open issue with 148 👍 and 30 👀 the only way to support this feature implementation?

Does anybody have a simple workaround for this? We are currently forced to do some TS generic wrapping everywhere we return something from Prisma, since the selected fields might change, and whenever the result contains the JSON field, it needs to be converted, like:

 Omit<T, "myJsonField"> & {myJsonField: MyJsonType}

where T is some generic that needs to be filled with typeof stuffReturnedFromPrisma...

which leads to code like:

const stuffReturnedFromPrisma = await prisma.MyModel.create(...);
const typedStuff = stuffReturnedFromPrisma as MyJsonTypeConvert<typeof stuffReturnedFromPrisma>;

or similar. And that's pretty verbose.
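One way to cut that verbosity is to centralize the cast in a single generic helper. A minimal sketch, assuming one known field name and a placeholder MyJsonType (both stand-ins for your own names):

```typescript
// Placeholder type and field name; substitute your own.
interface MyJsonType {
  label: string;
}

type WithMyJson<T extends { myJsonField: unknown }> =
  Omit<T, "myJsonField"> & { myJsonField: MyJsonType };

// Centralizes the unsafe cast in one helper; performs no validation.
function withMyJson<T extends { myJsonField: unknown }>(row: T): WithMyJson<T> {
  return row as unknown as WithMyJson<T>;
}

// Wrap whatever Prisma returns instead of casting at every call site.
const typed = withMyJson({ id: 1, myJsonField: { label: "x" } });
console.log(typed.myJsonField.label);
```

The generic parameter preserves whatever select shape the query produced, so only the JSON field's type changes; the trade-off is still the same: it's a cast, not a check.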

Is this open issue with 148 👍 and 30 👀 the only way to support this feature implementation?

Yes. Or posting additional use cases and information about how you want to use this.

And use the same approach as MongoDB embedded types? #6708 (comment)

Mostly, I think it is the same feature

Please stop spamming this topic!

This is an important feature that I support and I'm subscribed to this GitHub issue to receive notifications about any workaround or actual progress on this issue.

The only way to encourage development is through reactions, ideas, or PR! +1 or anything else does not help at all.

And use the same approach as MongoDB embedded types? #6708 (comment)

Mostly, I think it is the same feature

Yes, this feature here will definitely build on the same internal functionality as Embedded Documents / composite types for MongoDB. But we have to confirm that it works for MongoDB first and finish that, before we can move on to also release this for relational databases. But this should give you an indication that this is definitely not years off.

(But please come and help us test MongoDB! We can use anyone who has a MongoDB database to play around with it via db pull and then write a small Prisma Client using application for it.)

Hey @janpio! Any progress on this feature?
We need it badly...

Hey @janpio - Given this is now internally implemented through embedded docs in MongoDB, what would be involved in getting this working for the Json field? Auto-generated type safety for the Json type would be massively helpful for Cal.com and would honestly improve our metadata-related code 10x.

Willing to help make this happen 👍

Good catch, the existence of this other feature validated our approach here and will make it possible to take a lot of that logic and apply it to relational databases. We are just not there yet.

Would what you can see on https://www.prisma.io/docs/concepts/components/prisma-client/composite-types solve all your needs for this issue for other databases?

Absolutely; just brainstorming, but something like the following. I speculate that writing just appearance Appearance may be hard to parse due to the potential for naming conflicts?

type Appearance {
  backgroundColor String
  foregroundColor String
}

model MyUser {
  appearance Json<Appearance>
}

const user = prisma.myUser....;
// user.appearance.<backgroundColor | foregroundColor> is now typehinted 🥂 

Good catch, the existence of this other feature validated our approach here and will make it possible to take a lot of that logic and apply it to relational databases. We are just not there yet.

Would what you can see on https://www.prisma.io/docs/concepts/components/prisma-client/composite-types solve all your needs for this issue for other databases?

When I searched for this feature, I came across the composite types approach for MongoDB, and it seemed like a perfect fit, until I understood it was not available for other databases.
The approach is clean and simple to understand, a perfect fit for simple objects so we don't have to represent them in separate tables.
But there may be some edge cases I am not thinking of for complex schema definitions (nesting, unions, etc.).

https://www.prisma.io/docs/concepts/components/prisma-client/composite-types

I'm trying to come up with a hack/workaround to enable this for Postgres until this is added in master. Has anyone else attempted and made progress with this yet?

I would love to have this feature. <3
For example, I want a user to be able to choose whether their email is public:

Profile {
    email : { isPublic: true, data: "abc@gmail.com" }
}

I don't want to create another table to store this.

As much as I love the schema-driven approach of Prisma, this (and the poor migration rollback support) was a deal-breaker for the project my team is currently working on, and we had to move back to TypeORM 🫤

A great solution to this issue would be adding composite types to database providers other than MongoDB. Prisma has great documentation for composite types here: Composite Types | Prisma Docs.
I, and I believe many other developers, would love this feature to be added to Prisma.

https://www.prisma.io/docs/concepts/components/prisma-client/composite-types

I'm trying to come up with a hack/workaround to enable this for Postgres until this is added in master. Has anyone else attempted and made progress with this yet?

Error validating: Composite types are not supported on Postgres.
This error message emerges when you try to use composite types.

I'm not seeing this in the roadmap, but it would be great to have something like Embedded Documents on MongoDB but for SQL databases

I joined a meeting with Matt Mueller, PM @ Prisma, last month and one of the things we talked about was this specific feature. They said it was on their roadmap and they had already talked internally with their team about it.

As I told them before, this feature would help our company a lot. I'm also eager to hear more about this being implemented in the near future.

Besides waiting, I don't know if we can do anything else to help.

I think discussion of composite types is muddying the issue. The most straightforward (though non-trivial) solution would be to define a type for your Json column in your prisma schema, so that the client can be typed correctly - but store it in the DB as pure JSON. If your client is the only one writing to the DB, as it is in many applications, then it doesn't matter that the DB doesn't verify it, no?

In any case, this problem is a massive pain point in Prisma today and I was surprised to find there's no easy/clean solution yet. But hopefully we get one sooner than later.

@scatter-dev

  • AFAIK, composite types are exactly how it is being proposed those types should be defined - I’m not sure that it’s actually any different from what you’re suggesting
  • I’m not sure if there’s any possibility or intent for in-DB validation, but if it’s possible it still definitely has a place. For some reason or another, you may want to be running raw queries, or querying directly against the DB, or working with the DB in another app that isn’t able to use Prisma. Enforcing constraints at the DB level means a higher level of assurance around data integrity

Enforcing constraints at the DB level means a higher level of assurance around data integrity

Sure, but my understanding is that generally isn't possible with SQL Json columns, which is why the only other answer is DB-implemented composite types. Maybe that's the way to go anyway, but it feels like oversolving the problem (which, fundamentally, is about TS typing and not the actual data in the DB) to me. But I acknowledge that other clients inserting non-compliant data is a real issue.

I think my question is what is actually being proposed when people reference composite types here. 😄 I had the impression it was “use the type x {} syntax to define the schema for JSON columns”. I assume you're thinking it would mean something like CREATE TYPE (#4263), or a reusable type that expands to multiple fields when used?

(@scatter-dev I think the composite types mentioned here, are indeed exactly what you suggest. Nothing database side - unless of course turns out to be possible, useful and simple enough. I don't think it is though.)

Exactly, I believe we're looking to have the same behavior that we have on MongoDB but for Json types, no db enforcement

Hello! I really need this feature, extremely need it 🙂 Maybe it is possible to pay for its development? I think we could raise money for it.

+1

I don't think composite types would 100% fulfill this feature request unless more is added to them. For example, how could one express something like:

type ChildObject = {
  id: number;
  name: string
}

type RootObject = {
  id: number;
  childrenDictionary: Record<string, ChildObject>;
}

More specifically the Record<string, ChildObject> part.

Before implementing everything needed to map TS types 1:1, I think a first version would be more useful working like so:

schema.prisma:

model User {
  id               Int      @default(autoincrement()) @id
  name             String?
  extendedProfile  Json     @type("./types/profile", "extendedProfile")
}

A bit similar to how Nexus allows you to declare types for its context type: https://nexusjs.org/docs/adoption-guides/neuxs-framework-prisma-users#configuring-context-type

This way we can use our own TypeScript definitions and are not constrained by whatever is or isn't implemented as a composite type.

A potential problem with tying it to a TS types file would be lack of portability when using clients in other languages.

And it somewhat defeats the purpose of a schema file, as you are referencing source code anyway... But I'm not dismissing this idea; I've found myself needing Records in MongoDB schemas and ended up creating nested arrays with JS finds at runtime :/

TS should be the place to define external JSON types, as it provides the required expressiveness for complex/nested/cyclic types.

Also, you often "inherit" types from libraries, like for example rich text formats or graph (edge/node) formats, which you would have to manually redefine in the schema-DSL, otherwise.

Prisma may be not the single source of truth for JSON formats, unlike the rest of the table schema.

If you get anything wrong, or the type generation from the schema turns out slightly different from your expected types, you will end up in type mismatch hell 🔥 and would have been better off just overriding the types...

For anyone interested, I'm currently using a workaround for this, that preserves the Prisma-selection fields, while replacing all JSON types, even in nested structures, if necessary. It's a bit of a PITA to set up, but works flawlessly, once done.

Discussed here with Brandon from Blitz:
blitz-js/blitz#3723 (reply in thread)

here the snippet:

type Nullish = null | undefined;

// Our domain model is called "Item", so we name the Prisma type `PrismaItem`
type PrismaItem = Record<string, unknown> & {
  // these fields are JSON in the table and need to be replaced
  notepad?: Prisma.JsonValue | null;
  propertyValues?: Prisma.JsonValue;
  variantChoiceGuidance?: Prisma.JsonValue | null;
};

// This is a type converter for all fields to be converted

export type FromPrismaItem<
  IOptional extends Nullish | PrismaItem,
  I extends PrismaItem = Exclude<IOptional, Nullish>
> = IOptional extends Nullish
  ? IOptional
  : Omit<I, "notepad" | "propertyValues" | "variantChoiceGuidance"> & {
// these are custom JSON types defined by us or by libraries we use
      [Key in keyof I]: Key extends "notepad"
        ? RichText | null
        : Key extends "propertyValues"
        ? ItemPropertyValues
        : Key extends "variantChoiceGuidance"
        ? RichText | null
        : I[Key];
    };

// Type conversion respects `select`, nullable fields, and undefined
FromPrismaItem<null> // => null
FromPrismaItem<undefined> // => undefined
FromPrismaItem< {id: string, title: string }> // => { id: string, title: string}
FromPrismaItem<{id: string, title: string, notepad: Prisma.JsonValue,  }> // => { id: string, title: string, notepad: RichText }
FromPrismaItem<{id: string, title: string, notepad: null, variantChoiceGuidance: Prisma.JsonValue }> // => { id: string, title: string, notepad: null, variantChoiceGuidance: RichText }

You can of course re-use this type converter in other type converters, which will replace nested JSON, or just put it in a helper function fromPrismaItem or fromPrismaItemArray which you just wrap around your prisma.item.findMany etc. call.
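A self-contained, simplified sketch of such a fromPrismaItem helper (types reduced to a single JSON field; RichText here is a stand-in for the custom types above):

```typescript
// Simplified stand-ins for the types in the snippet above.
type JsonValue =
  | string
  | number
  | boolean
  | null
  | { [key: string]: JsonValue }
  | JsonValue[];

interface RichText {
  blocks: string[];
}

type PrismaItemLite = Record<string, unknown> & { notepad?: JsonValue | null };

type FromPrismaItemLite<I extends PrismaItemLite> = Omit<I, "notepad"> & {
  notepad: RichText | null;
};

// A typed identity function: no runtime cost, but also no validation.
function fromPrismaItem<I extends PrismaItemLite>(item: I): FromPrismaItemLite<I> {
  return item as unknown as FromPrismaItemLite<I>;
}

const item = fromPrismaItem({ id: "1", notepad: { blocks: ["hello"] } });
```

Because the generic parameter flows through, the other selected fields keep their inferred types and only the JSON field is replaced.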

I created a new type that my app consumes for the front end; as for the backend (I am using Next.js), we all know we can just take that JSON and stringify it (well, validate it as you wish before that).

import { PrismaEntityA } from "@prisma/client";

export type PrismaEntityAResponseType = PrismaEntityA & {
  schema: PrismaEntityAJsonSchemaType<PrismaEntityA>;
};

Now I can use whatever type PrismaEntityAJsonSchemaType provides me. Also, note that PrismaEntityAJsonSchemaType uses the parent type to handle some neat stuff like keyof types and whatnot.
I do not know how the Prisma team would implement this (interface PrismaEntityAJsonSchemaType<PrismaEntityA> { ... }), but if possible it would be amazing.

This is the solution I use to handle the Json type.
We created a type that changes the JSON type based on the property key; it can also handle nested objects and arrays.

e.g. we have this User and we want to change the name property type to { firstname: string; lastname: string } and age to { date: number; month: number; year: number }

type User = {
  name: JsonValue;
  age: JsonValue;
  friends: User[];
  father: User
}

type CorrectNameType = { firstname: string; lastname: string }
type CorrectAgeType = { date: number, month: number, year: number }

THIS IS THE TYPE (feel free to change the KeyType extends 'your property key' and the CorrectAgeType or CorrectNameType based on your types):

type ReplaceTypeObject<TypeObject> =
  TypeObject extends object
  ? ReplaceTypesByKey<TypeObject>
  : TypeObject;

type ReplaceTypesByKey<ObjType extends object> = {
  [KeyType in keyof ObjType]:
    KeyType extends 'age' ? CorrectAgeType :
      KeyType extends 'name' ? CorrectNameType :
        ReplaceTypeObject<ObjType[KeyType]>
}

type NewPrismaItem<ObjType extends object> = ReplaceTypesByKey<ObjType>;

Wrap it around your object:

type NewUser = NewPrismaItem<User>

RESULT

declare let oldExample: User;
oldExample.age.date // ❌
oldExample.name.firstname // ❌
oldExample.friends[0].name.firstname // ❌
oldExample.father.name.firstname // ❌

declare let newExample: NewUser;
newExample.age.date // ✅ 
newExample.name.firstname // ✅
newExample.friends[0].name.firstname // ✅
newExample.father.name.firstname // ✅

EXAMPLE REAL LIFE IMPLEMENTATION

const user = await prisma.findUnique......
....

return user as NewPrismaItem<typeof user>

Hope it helps 👍

Thought I'd mention what we ended up doing: we're using the zod-prisma package with custom Zod types applied to each Json field in our Prisma schema. Zod-prisma outputs z.objects for each of our Prisma models, where the Prisma Json fields are actually automatically typed according to our custom z.object for each field. This means that we have z.objects for each Prisma model generated along with our Prisma client which strongly type the Json fields.

@scatter-dev @janpio @Negan1911

Enforcing constraints at the DB level means a higher level of assurance around data integrity

Sure, but my understanding is that generally isn't possible with SQL Json columns, which is why the only other answer is DB-implemented composite types. Maybe that's the way to go anyway, but it feels like oversolving the problem (which, fundamentally, is about TS typing and not the actual data in the DB) to me. But I acknowledge that other clients inserting non-compliant data is a real issue.

Schema validation for JSON is natively supported on MySQL and MongoDB and with an extension on PostgreSQL. This would enable database-side integrity protections for JSON fields.

This would require generating a set of JSON Schema rules for each defined type, but this shouldn't be an issue.
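To make that concrete, here is a hand-written example of the JSON Schema document a generator might emit for the ExtendedUserProfileJson type proposed at the top of this thread (the generation step itself is hypothetical):

```typescript
// Hand-written example of what a generator might emit for the
// ExtendedUserProfileJson type proposed at the top of this thread.
const extendedProfileJsonSchema = {
  type: "object",
  properties: {
    field1: { type: "string" },
    field2: { type: "integer" },
  },
  required: ["field1", "field2"],
  additionalProperties: false,
};

// On MySQL such a document could back a CHECK (JSON_SCHEMA_VALID(...))
// constraint; on PostgreSQL it would need an extension, as discussed below.
console.log(JSON.stringify(extendedProfileJsonSchema));
```

The mapping is mechanical (String to "string", Int to "integer", optional fields dropped from required), which is why generating these rules from the schema shouldn't be the hard part.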

Schema validation for JSON is natively supported on MySQL and MongoDB and with an extension on PostgreSQL. This would enable database-side integrity protections for JSON fields.

This would require generating a set of JSON Schema rules for each defined type, but this shouldn't be an issue.

Sounds really cool. Maybe we could create a new issue to track this, but having these types solved plus a JSON Schema constraint in Prisma would make the JSON field really great and powerful.

A few years ago I built a GraphQL framework that was handling JSON structures in a Postgres DB based on the defined schema and it took me a few iterations to get to a satisfying solution.
I would advise against going with any extensions for Postgres (or any other DB).

  1. It's not always possible to use extensions. Some cloud providers may not have such options.
  2. It is too specific to the given DB engine. Building and testing validation for each system would be a lot of work.
  3. Migrations would be difficult and hard to automate.

There's a reason why we want to use JSON fields in a relational DB. We want that extra flexibility.
But that flexibility is defined differently for each person.
For one developer it might be important to always migrate changes in the JSON schema and for another, it would be enough to handle schema differences in the data set directly in the app.

And then we have also the topic of how validation should be done.
JSON schema validation is a long-established solution, with many libraries to choose from, but we also have now more TypeScript-oriented solutions like Zod.
I believe it is impossible to implement a validation that will make everyone happy.

IMO, the simplest approach to this problem would be to provide a way for developers to define a type for a given JSON field.
I don't want to go into the details of how this should happen, as I don't know the internals of Prisma, but anything (as shown in examples in this thread) from an attribute definition like @type(SomeType) to injecting the Prisma client with a type mapping would, I think, work sufficiently well.

This would enable us to use TypeScript with these JSONs and act on them as we desire.
How validation of the input and output should work is an implementation detail that should be a choice that a developer can make.
The doors would be open for e.g. using class transformers with a Nestjs DTO, or having Zod within a tRPC/Next setup, or a plain JSON schema validation via Ajv.

Maybe the possibility of defining a parser function could be helpful as well.
So, whenever a JSON field is being read, that parser would be triggered and the developer could implement handling logic for validating the structure (and possibly mitigating legacy structures).
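A dependency-free sketch of that parser hook (names and shapes are invented; in practice you might plug Zod, Ajv, or class-transformer into the parse function):

```typescript
// Names and shapes here are invented for illustration.
type Parser<T> = (raw: unknown) => T;

// The hook: every read of a JSON field goes through a user-supplied parser,
// which can validate, throw, or migrate legacy structures.
function readJsonField<T>(raw: unknown, parse: Parser<T>): T {
  return parse(raw);
}

interface Profile {
  field1: string;
  field2: number;
}

const parseProfile: Parser<Profile> = (raw) => {
  if (typeof raw !== "object" || raw === null) {
    throw new Error("Profile must be an object");
  }
  const o = raw as Record<string, unknown>;
  if (typeof o.field1 !== "string" || typeof o.field2 !== "number") {
    throw new Error("bad Profile shape");
  }
  return { field1: o.field1, field2: o.field2 };
};

const profile = readJsonField({ field1: "a", field2: 2 }, parseProfile);
```

Keeping the parser as a plain function is what leaves the validation choice to the developer, as argued above: the same hook signature accommodates a hand-rolled guard, a Zod schema's parse, or an Ajv-compiled validator.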

Here's an example of how I define the type of a nested JSON field while using tRPC:

const byIdBuildSelect = Prisma.validator<Prisma.BuildSelect>()({
  id: true,
  projectId: true,
  buildId: true,
  comparisons: {
    orderBy: {
      id: 'desc',
    },
  },
})

type ByIdOutput = z.infer<
  z.ZodType<Prisma.BuildGetPayload<{ select: typeof byIdBuildSelect }>>
> & {
  comparisons: Array<{
    meta: ShotComparison['meta'] | null
  }>
}

export const buildRouter = router({
  byId: protectedProcedure
    .input(
      z.object({
        id: z.string(),
      })
    )
    .query(async ({ input, ctx }): Promise<ByIdOutput> => {

This is just handling the output types, so I get to consume them in my React app, input validation is a topic on its own.
Having those JSON fields typed in Prisma would cut down a lot of boilerplate throughout my code base. 😅

But again, this is just my use case and my choice of libraries for the given project.
Keeping this part flexible for developers to decide how to implement validation would be a major plus. 🤘

@chriskalmar what @agucova is talking about is at the database level, which MySQL supports natively; you have a great point that it may not be available on Postgres, depending on whether you can add extensions. Having an (optional) way to set up a JSON constraint on your DB field beats any kind of validator, because it assures you that the shape you expect is the shape you will receive, without depending on the Prisma layer to do the validation.

Thank you @Negan1911 🙏 for sharing the details on MySQL's native JSON validation support.
I took a quick look at their docs and it seems to be Draft 4 of JSON schema.
Out of curiosity, I wanted to check how migrations are being handled, but I couldn't find anything on that. Maybe you could share your experiences.

Anyhow, I spent many years building DB models and procedures, and there was even a time when I was much closer to the DB than nowadays.
I was validating JSON schemas within PL/v8 (JavaScript for Postgres) procedures using Ajv and DB triggers. 😅
The sheer fact that it was possible to write JavaScript code inside of Postgres was just incredible.
But it all came at the heavy price of a really bad DX.

Since then I learned to value a DB for what it can do really well:

  • indexing data
  • datatype validation
  • referential integrity
  • uniqueness constraints
  • and so on ...

Validating JSON schemas is something that I'd rather control on another level.
It's important for the DB to check if it is a valid JSON on its own, but everything that goes deeper should end up in the application logic.
If you think about it, a similar case would be the validation of an email attribute.
Sure you can put a Regex constraint on the DB field, but wouldn't you rather have it in your app code where you have plenty of tooling to build the best possible validation, including unit tests?
Or another example: range constraints on numeric fields.
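For concreteness, the email and range checks mentioned above are trivial to express and unit-test as plain application code (the regex is deliberately simplistic and only illustrative):

```typescript
// App-level validators: easy to unit test, unlike DB CHECK constraints.
// The pattern below is a simplistic illustration, not a full RFC 5322 check.
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/

function isValidEmail(value: string): boolean {
  return EMAIL_RE.test(value)
}

// A range constraint on a numeric field, expressed as code instead of SQL.
function isValidAge(value: number): boolean {
  return Number.isInteger(value) && value >= 0 && value <= 150
}
```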

On top of that, dealing with DB errors is not that great and I'm just trying to picture Prisma needing to translate the JSON schema errors into actionable intel for the developer to act on. As mentioned before, I believe this to be a very tough task and seems to be quite out of scope of what Prisma is. But that's something for the core maintainers to assess.

Furthermore, JSON schema has its limitations too. Some validation logic won't work in a declarative fashion and will only work in actual code. Also, I don't like the fact that I would have to upgrade the DB version to get a newer JSON schema draft version.

And finally, I don't want to burden my DB with complex, CPU- and memory-intensive validation work if I can do it in the app. Scaling a DB is a nightmare. Scaling a stateless app is a walk in the park. 😁

Please don't get me wrong, I have nothing against JSON schema validation, in fact, I'm still using it in many places. It's a great standard.
And also, I 💜 hearing about people using interesting features of a DB that are not just about simple data storage. 🙌

I just believe that in the case of Prisma it might be a better approach to enable developers with a way to hook into the validation/parsing process of JSON fields but at the same time to leave the implementation details to them.
This would allow for a faster addition of this much-needed feature and it would be more future-proof than any deeper integrations with DB engines. Because today it's JSON schema, tomorrow it's Yup or Zod, and 3 months from now who knows?
I'm certain that the DB providers will take longer to adopt new approaches and then Prisma would need to update those adapters before we would get to use them.
That's why I'd prefer for Prisma to leave the validation details to me, because I'm just one npm install away from using the latest and greatest that the validation world has to offer. 😁

Now the good news is, if Prisma has a flexible approach to this, you could still transform (there's for sure a lib for that) your types into JSON schemas and apply them to your Mysql DB. It would be the developer's choice to do so.
I often use JSON schemas for metadata or new features that I want to test with customers and I don't want to migrate a few hundred thousand rows just yet. A simple data validator lets me know if it's the new structure or not and I get to decide how to display that data and if I want to migrate it right after that. A rigid DB-level validator would make it difficult. And once I decide that I want to stabilize that feature I most likely move that JSON property into a dedicated field.
Your use case might be a completely different one, but we both would be able to build our solutions on top of a flexible JSON validation design.
Wdyt?
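The "decide how to display that data" part of the workflow above can be handled with a version tag on the stored JSON, e.g. a discriminated union that lets old and new structures coexist until a migration is worthwhile. A sketch, with made-up names:

```typescript
// Two coexisting structures of the same Json column, distinguished at
// read time by a version tag (all names are illustrative).
interface ProfileV1 {
  version: 1
  fullName: string
}

interface ProfileV2 {
  version: 2
  firstName: string
  lastName: string
}

type Profile = ProfileV1 | ProfileV2

// Normalize on read instead of migrating all rows up front.
function displayName(profile: Profile): string {
  switch (profile.version) {
    case 1:
      return profile.fullName
    case 2:
      return `${profile.firstName} ${profile.lastName}`
  }
}
```

A rigid DB-level schema would have to admit both variants at once; a client-side union like this keeps the old rows readable while new writes use the new shape.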

Sorry for the long post 🙈

Out of curiosity, I wanted to check how migrations are being handled, but I couldn't find anything on that. Maybe you could share your experiences.

They are constraints added to a table; when you alter the table you can create, update, or drop them, so you could handle them in migrations like any ALTER TABLE (at least on MySQL).
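To illustrate: on MySQL 8.0.17+ such a constraint is a regular CHECK constraint built on the JSON_SCHEMA_VALID function, so a hand-written migration could run a statement like the one assembled below. The table and column names are taken from the example schema at the top of this issue; the JSON Schema itself is just a sketch:

```typescript
// JSON Schema (Draft 4 subset, per MySQL's docs) for the Json column.
const profileSchema = JSON.stringify({
  type: 'object',
  required: ['field1', 'field2'],
  properties: {
    field1: { type: 'string' },
    field2: { type: 'integer' },
  },
})

// CHECK constraint using MySQL 8.0.17+'s built-in JSON_SCHEMA_VALID.
// Dropping it later is a plain ALTER TABLE ... DROP CONSTRAINT.
const alterSql =
  `ALTER TABLE User ` +
  `ADD CONSTRAINT extended_profile_shape ` +
  `CHECK (JSON_SCHEMA_VALID('${profileSchema}', extendedProfile))`
```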

Anyhow, I spent many years building DB models and procedures, and there was even a time when I was much closer to the DB than nowadays.
I was validating JSON schemas within PL/v8 (JavaScript for Postgres) procedures using Ajv and DB triggers. 😅
The sheer fact that it was possible to write JavaScript code inside of Postgres was just incredible.
But it all came at the heavy price of a really bad DX.

Well, that's your approach, which can be correct for some cases and maybe not for others. What we want is options: since it's already supported by the DB, it would be good to have it for whoever wants to use it.

Since then I learned to value a DB for what it can really well:

indexing data
datatype validation
referential integrity

"datatype validation" is exactly what we could have with JSON Schema; it would at least give us a safeguard on what can be put into a Json column.

If you think about it, a similar case would be the validation of an email attribute.
Sure you can put a Regex constraint on the DB field, but wouldn't you rather have it in your app code where you have plenty of tooling to build the best possible validation, including unit tests?

It depends; maybe the DB is consumed by different apps, maybe not even in the same language.

If you think about it, a similar case would be the validation of an email attribute.
Sure you can put a Regex constraint on the DB field, but wouldn't you rather have it in your app code where you have plenty of tooling to build the best possible validation, including unit tests?
Or another example: range constraints on numeric fields.

Why not? Sure, what you say is an approach, maybe a good one in some cases, but it's not the absolute truth. By that logic, one could say that validating whether something is null shouldn't be done in the DB either.

Maybe I don't want to rebuild my validation methods in three different languages because I have three different clients, or, maybe, I just want to make use of the power that the DB provides.

Or maybe I just don't want my app to spectacularly blow up because the column it expected to hold an array of strings contains an integer, or is malformed.

On top of that, dealing with DB errors is not that great and I'm just trying to picture Prisma needing to translate the JSON schema errors into actionable intel for the developer to act on.

In the same way that handling a unique constraint violation is, and we already have that. Heck, somebody just point me to a sample of how I could build those attributes in this repo and I'd give it a shot!

And finally, I don't want to burden my DB with complex, CPU- and memory-intensive validation work if I can do it in the app. Scaling a DB is a nightmare. Scaling a stateless app is a walk in the park. 😁

Maybe because you're biased by your own approach, maybe I'm using Neon or Planetscale, or maybe, for me and many other people, the benefits of validating in the DB outweigh any downside like that.

I just believe that in the case of Prisma it might be a better approach to enable developers with a way to hook into the validation/parsing process of JSON fields but at the same time to leave the implementation details to them.
This would allow for a faster addition of this much-needed feature and it would be more future-proof than any deeper integrations with DB engines. Because today it's JSON schema, tomorrow it's Yup or Zod, and 3 months from now who knows?

That approach doesn't solve the fact that anyone could write bad data to the DB with no constraint to protect against it. Sure, put whatever validation engine you want on Prisma; I could still break it with an inline INSERT. With the JSON schema enforced by the DB, you would not be able to do that (well, unless you really want to).

Because today it's JSON schema, tomorrow it's Yup or Zod, and 3 months from now who knows?

Except that you don't load JS or any library into the DB; you just set a schema and let the DB handle it. JSON Schema is not a fancy library, it is a specification.

That's why I'd prefer for Prisma to leave the validation details to me, because I'm just one npm install away from using the latest and greatest that the validation world has to offer. 😁

I don't want Prisma to enforce validation details, I want Prisma to let me use the tools MySQL/Postgres provide (especially now that we have extension management).

Now the good news is, if Prisma has a flexible approach to this, you could still transform (there's for sure a lib for that) your types into JSON schemas and apply them to your Mysql DB. It would be the developer's choice to do so.

Unless you can tap into the schema and let Prisma know how to enforce those in the DB, I see that being difficult.

I often use JSON schemas for metadata or new features that I want to test with customers and I don't want to migrate a few hundred thousand rows just yet. A simple data validator lets me know if it's the new structure or not and I get to decide how to display that data and if I want to migrate it right after that. A rigid DB-level validator would make it difficult. And once I decide that I want to stabilize that feature I most likely move that JSON property into a dedicated field.

No, because you have no guarantee that something you don't expect to pull back can't be written to the DB. I don't want to find out that my data is bad when pulling it; I want to know when saving it.
And if the solution is "validate client side", then we could say the same for unique or null constraints, and we effectively dumb down the DB.

Your use case might be a completely different one, but we both would be able to build our solutions on top of a flexible JSON validation design.
Wdyt?

No, because again, nothing guarantees that I can't write garbage to the DB, unless the constraint is in the DB.

I don't want Prisma to force everyone to use a constraint; if your case is a monolith or a single app running on your DB, you may be OK with validating on the client side, but I don't see why we should lose the ability to validate in the DB.


Again, to recap: I think the misunderstanding is that this topic is being treated in this "define the content of Json field" issue, which is not the same thing. I expect this topic to continue being only about TS types, but if I also want to be able to set my constraints in the DB (like a JSON schema), I should be able to do so.

Also, because this may be the confusing part: I don't want Prisma to validate anything, I'd just like a way to express in Prisma that I want to set constraints on the DB.

This won't limit the current functionality of Prisma; if folks want to continue validating on the client side (rightfully so, and the best approach in some cases), they can. But it won't limit us to just that approach.

@chriskalmar, I second (most of) what you wrote 💙

There are usually business rules that go beyond the capabilities of JSON schema or the database in general anyway.

If you have multiple applications, and they share at least one of these business rules, they either have to share some code (i.e. libraries, e.g. Prisma), or you will (hopefully) go through some universal REST-API or GraphQL-service anyway, that enforces these business rules.

Without these, you will always be able to write "garbage" to the DB, at least from a business perspective.

Again, to recap: I think the misunderstanding is that this topic is being treated in this "define the content of Json field" issue, which is not the same thing. I expect this topic to continue being only about TS types, but if I also want to be able to set my constraints in the DB (like a JSON schema), I should be able to do so.

Exactly 👌. I think DB-side or Prisma-side validation of JSON should be a separate issue (is there one already?). Maybe I got it wrong, but I was thinking this issue here was like:

"Hereby I, the mighty developer, state, that within the realm of my Prisma-using code, this JSON-column shall beth of type Foo!" 👑

No matter what the DB really contains. So I thought this issue is more about what I say is in the DB when fetching from it, not writing to it. Obviously in that case, it's the application code that should define the TypeScript types to use.

While you two seemed to be talking about asserting type-safety via Prisma (directly or via underlying DB functionality) for writes as well.

We should maybe clarify what this is about?

(Interesting discussion going on here. See also the new, related issue #16654 that is specifically about JSON Schema constraints for data)

@nickluger

No matter what the DB really contains. So I thought this issue is more about what I say is in the DB when fetching from it, not writing to it. Obviously in that case, it's the application code that should define the TypeScript types to use.

Yeah, you are totally right, that's why I created #16654 so we don't confuse things.

I may have been caught up in the discussion and we lost the real point of this ticket, which is to be able to define the types of the Json field in the Prisma schema.

At least from what I understand, this ticket doesn't involve anything about validating the data; it's just about being able to define the shape in the Prisma schema and get proper TS types accordingly, like Prisma does for MongoDB embedded documents.


If someone is interested on keeping the conversation about Json schema / check constraints we could also do it on #16654

At least from what I understand, this ticket doesn't involve anything about validating the data; it's just about being able to define the shape in the Prisma schema and get proper TS types accordingly, like Prisma does for MongoDB embedded documents.

Composite types for MongoDB also "validate" the data, during writing and reading - but in Prisma Client and not the database. Possibly the better differentiation here is that this issue is about Prisma CLient and Engine behavior, and #16654 is about the database, Migrations and Introspection. And of course the combination of both would be most powerful - but both can also exist by themselves.

Out of curiosity, I wanted to check how migrations are being handled, but I couldn't find anything on that. Maybe you could share your experiences.

They are constraints added to a table; when you alter the table you can create, update, or drop them, so you could handle them in migrations like any ALTER TABLE (at least on MySQL).

Yes, very true, but the problem with enforced JSON schemas here would be that you would need to deal with a data migration during the structural migration. And that is something that I would rather keep optional for each use case.
Some devs would like to migrate the structure right away, some would like to deal with it on data consumption.
I don't know how Mysql applies that JSON schema change, but I would assume it expects the data to conform to it, and so it would block a migration if the data is in the wrong shape, right?

Anyhow, I spent many years building DB models and procedures, and there was even a time when I was much closer to the DB than nowadays.
I was validating JSON schemas within PL/v8 (JavaScript for Postgres) procedures using Ajv and DB triggers. 😅
The sheer fact that it was possible to write JavaScript code inside of Postgres was just incredible.
But it all came at the heavy price of a really bad DX.

Well, that's your approach, which can be correct for some cases and maybe not for others. What we want is options: since it's already supported by the DB, it would be good to have it for whoever wants to use it.

I'm not opposed to that. I just wouldn't love mixing the TypeScript definition in the client with a rigid JSON schema validator in the DB.
But from what I can tell of where the discussion is going, those 2 topics need to be treated separately anyway. 👍

Since then I learned to value a DB for what it can really well:

indexing data, datatype validation, referential integrity

"datatype validation" is exactly what we could have with JSON Schema; it would at least give us a safeguard on what can be put into a Json column.

The datatype of a JSON field is JSON, any JSON for that matter. Like Integer, Text, and so on.
If you want to have a more detailed type (e.g. only even integers, text that contains no special characters, JSON in a specific shape, ...) you will need to enforce additional rules on top of the original datatype.
I know what you mean, but still, looking at the native datatype, Prisma and the DB are already doing the job of validating a JSON field.

If you think about it, a similar case would be the validation of an email attribute.
Sure you can put a Regex constraint on the DB field, but wouldn't you rather have it in your app code where you have plenty of tooling to build the best possible validation, including unit tests?

It depends; maybe the DB is consumed by different apps, maybe not even in the same language.

Yes, agreed, this could be a good use case for having a JSON schema validation in place.
But to be honest, having multiple apps writing to the same DB table has a certain architectural smell to it. Usually, each table is owned by an app, not only on write but also on read operations. It should expose an API for other apps to interact with that data. Multiple apps in different languages sounds like 2 teams would be working on it. Now imagine a JSON schema change on the table, I think that would be a bit painful and such approaches wouldn't scale well in a company.
Still, there might be edge cases where this is a valid solution and a strict schema on the JSON field would definitely help.

If you think about it, a similar case would be the validation of an email attribute.
Sure you can put a Regex constraint on the DB field, but wouldn't you rather have it in your app code where you have plenty of tooling to build the best possible validation, including unit tests?
Or another example: range constraints on numeric fields.

Why not? Sure, what you say is an approach, maybe a good one in some cases, but it's not the absolute truth. By that logic, one could say that validating whether something is null shouldn't be done in the DB either.

Null only indicates whether a value is set or not. It is shared among all datatypes, but it does not define or restrict datatypes in any way. I'm afraid I fail to see the point.

Maybe I don't want to rebuild my validation methods in three different languages because I have three different clients, or, maybe, I just want to make use of the power that the DB provides.

As mentioned above, I think that letting multiple apps (regardless of whether the languages differ) have write access to the same table is problematic. But that's just my view, I'm not speaking for everyone.

Or maybe I just don't want my app to spectacularly blow up because the column it expected to hold an array of strings contains an integer, or is malformed.

I understand that. And I see your argument as a valid one. I would just prefer to be able to hook into Prisma's read event to run my own validation. JSON schema has limitations. So, instead of having to maintain a JSON schema for the database and a Zod (or any other solution) schema separately, I would rather have just one source of truth.

On top of that, dealing with DB errors is not that great and I'm just trying to picture Prisma needing to translate the JSON schema errors into actionable intel for the developer to act on.

In the same way that handling a unique constraint violation is, and we already have that. Heck, somebody just point me to a sample of how I could build those attributes in this repo and I'd give it a shot!

I think that's a great idea. A PoC might help in the decision process and could convince people to the one or the other approach. 👍

And finally, I don't want to burden my DB with complex, CPU- and memory-intensive validation work if I can do it in the app. Scaling a DB is a nightmare. Scaling a stateless app is a walk in the park. 😁

Maybe because you're biased by your own approach, maybe I'm using Neon or Planetscale, or maybe, for me and many other people, the benefits of validating in the DB outweigh any downside like that.

Of course, I'm biased. I'm just human, it's my nature. 😅
I just kindly want to remind you that Prisma should work in any setup. We don't always get to choose where the DB runs.
That's also the reason why I believe that needing an extension (e.g. in the case of Postgres) is not the best idea, because my environment might not support it. For the same reason, having a super-scalable, managed DB is not always possible. Apart from the fact that I wouldn't welcome occupying the DB with validating non-native datatypes, I believe that the solution Prisma should be aiming for should be as open as possible.

I just believe that in the case of Prisma it might be a better approach to enable developers with a way to hook into the validation/parsing process of JSON fields but at the same time to leave the implementation details to them.
This would allow for a faster addition of this much-needed feature and it would be more future-proof than any deeper integrations with DB engines. Because today it's JSON schema, tomorrow it's Yup or Zod, and 3 months from now who knows?

That approach doesn't solve the fact that anyone could write bad data to the DB with no constraint to protect against it. Sure, put whatever validation engine you want on Prisma; I could still break it with an inline INSERT. With the JSON schema enforced by the DB, you would not be able to do that (well, unless you really want to).

Why would somebody write bad data into the DB on purpose? I assume that inputs are validated prior to saving them (Never trust the client).
But if there is indeed a good reason for writing directly to the DB (which I'm still trying to understand why) you would just need to validate the data upfront. This just sounds like an extreme case, because usually we should have only one way of processing and storing information. How else would you make sure that all business rules apply each time? JSON schema won't fix that.

Because today it's JSON schema, tomorrow it's Yup or Zod, and 3 months from now who knows?

Except that you don't load JS or any library into the DB; you just set a schema and let the DB handle it. JSON Schema is not a fancy library, it is a specification.

Maybe not with Mysql, but for Postgres you would need an additional extension. And (just to name an example) last time I checked, Amazon has only a limited number of allowed extensions for their Postgres RDS service, and JSON schema validation is not one of them. But again, it doesn't matter if it's Amazon or another provider, some service always has some sort of limits.
Also, what about SQLite and MS SQL Server? Wouldn't it be great to have validation capabilities for those engines as well? Not that I'm using any of these, but again, Prisma should stay inclusive in my opinion.

That's why I'd prefer for Prisma to leave the validation details to me, because I'm just one npm install away from using the latest and greatest that the validation world has to offer. 😁

I don't want Prisma to enforce validation details, I want Prisma to let me use the tools MySQL/Postgres provide (especially now that we have extension management).

I didn't say anything about enforcing validation. It should be something that developers can opt-in.

Now the good news is, if Prisma has a flexible approach to this, you could still transform (there's for sure a lib for that) your types into JSON schemas and apply them to your Mysql DB. It would be the developer's choice to do so.

Unless you can tap into the schema and let Prisma know how to enforce those in the DB, I see that being difficult.

I don't know if Prisma provides (or will provide) a hook to run custom commands after a migration job, but it shouldn't be that difficult to just have a custom npm script that combines the migration job with the JSON schema "injection" into the DB.
Maybe not very elegant, but it could work.

I often use JSON schemas for metadata or new features that I want to test with customers and I don't want to migrate a few hundred thousand rows just yet. A simple data validator lets me know if it's the new structure or not and I get to decide how to display that data and if I want to migrate it right after that. A rigid DB-level validator would make it difficult. And once I decide that I want to stabilize that feature I most likely move that JSON property into a dedicated field.

No, because you have no guarantee that something you don't expect to pull back can't be written to the DB. I don't want to find out that my data is bad when pulling it; I want to know when saving it. And if the solution is "validate client side", then we could say the same for unique or null constraints, and we effectively dumb down the DB.

Sorry, but I don't see how this would dumb down the DB here. I already explained my reasoning regarding Null constraints. And when it comes to uniqueness constraints it's a whole different topic. So far, we were talking about row-level data validation, uniqueness constraints on the other hand are table constraints, which require an index and so on. I don't see how that relates to a JSON schema validation.

Furthermore, I don't need guarantees. I have a validator when writing the data and thus I know what the structure will be like.
And in case I want to experiment with structures, I can do that too. I just need to be prepared on the client side to deal with various versions of structures of a given field. This would be a perfectly normal case for a new feature that I would want to roll out to a limited list of users. By enforcing a JSON schema on the DB directly, I would have to migrate with every tiny change. Also, I wouldn't be able to flexibly change the schema; I would need to extend it so it can still hold the old structure next to the new one.

Your use case might be a completely different one, but we both would be able to build our solutions on top of a flexible JSON validation design.
Wdyt?

No, because again, nothing guarantees that I can't write garbage to the DB, unless the constraint is in the DB.

I wonder why that should be even possible in the first place? If you need to manually run insert/update statements on your DB, then well, you always need to be careful. But I think this is a bit far outside of the scope in regards to Prisma.

I don't want Prisma to force everyone to use a constraint; if your case is a monolith or a single app running on your DB, you may be OK with validating on the client side, but I don't see why we should lose the ability to validate in the DB.

Look, as already said, I'm not opposed to that. If Prisma decides to follow that path it's fine too. I'm sure many developers would appreciate using JSON schema validation with Mysql. I just don't think it is a flexible and inclusive enough approach. Prisma is not just a Mysql solution.

Again, to recap: I think the misunderstanding is that this topic is being treated in this "define the content of Json field" issue, which is not the same thing. I expect this topic to continue being only about TS types, but if I also want to be able to set my constraints in the DB (like a JSON schema), I should be able to do so.

Also, because this may be the confusing part: I don't want Prisma to validate anything, I'd just like a way to express in Prisma that I want to set constraints on the DB.

This won't limit the current functionality of Prisma; if folks want to continue validating on the client side (rightfully so, and the best approach in some cases), they can. But it won't limit us to just that approach.

Yes, I think that is a very valid point. Even though JSON schema validation within the DB and client-side input/output validation are on the same topic, I would also agree that both approaches solve somewhat different problems, and each of them has its very specific pros and cons.

Splitting these topics makes absolute sense. 🙌

Look, as already said, I'm not opposed to that. If Prisma decides to follow that path it's fine too. I'm sure many developers would appreciate using JSON schema validation with Mysql. I just don't think it is a flexible and inclusive enough approach. Prisma is not just a Mysql solution.

I don't understand, if you're not willing to use it, why you invest so much in debunking it. I mean, having a JSON schema constraint will not do anything to you if you don't plan to use it.
How is something "not inclusive" when you can decide whether to use it or not?

I'm not quite sure what you mean by "follow that path", as if having this feature would render the framework unusable for you or anyone else who doesn't use it; that doesn't make any sense. It would be like any other feature Prisma has: if you don't want to use it, just don't declare any constraint and that's it.

Prisma is not just a Mysql solution.

Neither is it a Postgres-only one, but they have PostgreSQL extensions and I don't see non-Postgres users complaining.

It's a valid use case; maybe not for you, but for a lot of people (like me) it is. It doesn't really affect anyone who doesn't want to use it, and it definitely doesn't affect the platforms that are outside this scope.

Look, as already said, I'm not opposed to that. If Prisma decides to follow that path it's fine too. I'm sure many developers would appreciate using JSON schema validation with Mysql. I just don't think it is a flexible and inclusive enough approach. Prisma is not just a Mysql solution.

I don't understand, if you're not willing to use it, why you invest so much in debunking it. I mean, having a JSON schema constraint will not do anything to you if you don't plan to use it. How is something "not inclusive" when you can decide whether to use it or not?

I'm not trying to debunk anything, I'm just sharing my opinions based on my personal experiences. I am very sorry if that came across in a way that would make you feel like that's the case.

My plan is to have a healthy discussion on the topic to finally come to a solution that everybody can enjoy.

So if I'm saying "not inclusive", then I'm addressing a rather complex feature being built for just one DB provider, and maybe a second one in case the environment (e.g. extensions) allows for it.
Whether a feature is optional has nothing to do with it being inclusive.

I'm not quite sure what you mean by "follow that path", as if having this feature would render the framework unusable for you or anyone else who doesn't use it; that doesn't make any sense. It would be like any other feature Prisma has: if you don't want to use it, just don't declare any constraint and that's it.

How you came from my wording to this conclusion is absolutely beyond me. On several occasions, I wrote that I'm not opposed to the idea in general, it's just not the direction I would go for.
Also, I even encouraged you to go on with a PoC.
I don't understand where this is coming from. Please read my answer again.

Prisma is not just a Mysql solution.

Neither is it a Postgres-only one, but they have PostgreSQL extensions and I don't see non-Postgres users complaining.

I see quite a difference in feature complexity between installing an extension (which is one command) and integrating a validation and migration concept. But I get your point and I take this as a good argument.

It's a valid use case; maybe not for you, but for a lot of people (like me) it is. It doesn't really affect anyone who doesn't want to use it, and it definitely doesn't affect the platforms that are outside this scope.

As you only addressed a few of my answers, can I assume that we at least agree on the rest of our discussion?

No, but this is not the proper place to keep having these conversations because this ticket is not about the JSON schema constraints, #16654 is.

Loving Prisma in all other ways, but coming from TypeORM I tripped over this one pretty hard with columns like:

/** An array of [x, y] points that make up the block boundary. */
@Column({ type: 'jsonb' })
border: [number, number][]

This just worked in TypeORM. So I'm scrambling to come up with some sort of type hack that swaps out Prisma.JsonValue fields for the fields on my nest.js DTOs, and it's not going well.

So yeah, big +1 on this issue.

I managed a workaround that is a giant hack, but it seems to work.

This file parses the types that Prisma generates and amends them by swapping out Prisma.JsonValue and InputJsonValue with Dto.MyModelNameDto['somePropName']. Sadly, it does not do this by parsing the types into an AST, but by parsing the text itself, so it is very likely to break if Prisma ever changes the internals of its type outputs.

So, use at your own risk and YMMV.


To make it work you'll want to have a supply of Dto types at the location of dtoImport that looks something like:

type BookDto = {
  title: string
  author: { firstName: string, lastName: string } // this is a Json field in your prisma schema
}

Then that should modify all the types in the prisma exports by changing:

type Book = {
  title: string
  author: Prisma.JsonValue
}

Into:

type Book = {
  title: string
  author: Dto.BookDto['author']
}

You'll want to add a script to package.json or somewhere that runs the prisma generator command, and then runs this script. That way the file exists for the script to modify.

If you want to use this you'll probably have to alter it to work for your codebase and usecase.

So for better or worse, I donate this to everyone who is otherwise blocked by this and willing to use this disgusting hack.
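For concreteness, the "run the generator, then run this script" step mentioned above might be wired up like this in package.json (the script name and file path are assumptions; adjust them to your project):

```json
{
  "scripts": {
    "db:generate": "prisma generate && ts-node ./scripts/fix-json-types.ts"
  }
}
```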

import { readFileSync, writeFileSync } from 'fs'

/** This imports all Dto's that have the JSON types we need on them. */
const dtoImport = "import * as Dto from 'path/to/some/types/somewhere';"

/** This is the path of the prisma client types. */
const prismaTypesPath = './node_modules/.prisma/client/index.d.ts'

/**
 * These are the suffixes of the types we want to modify. The type names will match: `${ModelName}${Suffix}`
 *
 *      type MyModelCreateInput // for example
 */
const prismaTypeSuffixes = [
  'CreateInput',
  'UncheckedCreateInput',
  'UpdateInput',
  'UncheckedUpdateInput',
  'WhereUniqueInput',
  'CreateManyInput',
  'UpdateManyMutationInput',
  'UncheckedUpdateManyInput',
]

function fixJsonTypes() {
  // eslint-disable-next-line no-console
  console.log("Modifying Prisma JSON types to match Dto's...")

  let lines = readTypeFile()
  lines = addDtoImport(lines)
  lines = replaceModelFields(lines)
  writeResult(lines)
}

function readTypeFile(): string[] {
  return readFileSync(prismaTypesPath, 'utf8').toString().split('\n')
}

function addDtoImport(lines: string[]): string[] {
  return [dtoImport, ...lines]
}

/** Returns model names from the `export const ModelName` object in the prisma types file. */
function parseModelNames(lines: string[]): string[] {
  let inModelNames = false
  const modelNames: string[] = []

  for (const line of lines) {
    if (line.includes('export const ModelName')) {
      inModelNames = true
      continue
    }

    if (line.includes('}')) {
      inModelNames = false
      continue
    }

    if (inModelNames) modelNames.push(line.trim().split(':')[0])
  }
  return modelNames
}

/** Replace non specific JSON fields with specific types. */
function replaceModelFields(lines: string[]): string[] {
  const modelNames = parseModelNames(lines)
  let currentModelName: string | null = null

  return lines.map((line, index) => {
    if (!currentModelName) {
      currentModelName = parseTypeStart(line, modelNames)
      return line
    }

    if (parseTypeEnd(line)) {
      currentModelName = null
      return line
    }

    return replaceJsonValueType(line, index, currentModelName)
  })
}

/** Returns the model name from a model type declaration. If line is anything else return `null`. */
function parseTypeStart(line: string, modelNames: string[]): string | null {
  const modelTypeStartRegex = new RegExp(
    `export type (${modelNames.join('|')})(?:${prismaTypeSuffixes.join('|')})? = {`,
  )
  return line.match(modelTypeStartRegex)?.[1] ?? null
}

/** Returns true if this line terminates a type. */
function parseTypeEnd(line: string): boolean {
  return line.includes('}')
}

/** Replace any Json type on a line with the specific DTO types instead. */
function replaceJsonValueType(line: string, index: number, currentModelName: string): string {
  // Only inspect lines that actually contain a Json type, so lines without a
  // `prop: Type` shape (e.g. braces or blanks) pass through untouched.
  if (line.includes('Prisma.JsonValue') || line.includes('InputJsonValue')) {
    const propName = line.match(/(\w+)\??:/)?.[1]
    if (!propName) throw new Error(`line ${index} expected prop and type, got: ${line}`)
    const propType = `Dto.${currentModelName}Dto['${propName}']`
    return line.replace('Prisma.JsonValue', propType).replace('InputJsonValue', propType)
  }
  return line
}

function writeResult(lines: string[]): void {
  writeFileSync(prismaTypesPath, lines.join('\n'))
}

// Run when the file loads
fixJsonTypes()

Another solution to that would be,
jsonData json<{ name: string }> @default("{}")

Please always keep in mind that JSON types could be recursive and the solution should provide the required expressiveness to define these.
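To illustrate what "recursive" means here, this is a sketch of a self-referential JSON value type in TypeScript (the names are illustrative, not part of any proposal):

```typescript
// A recursive JSON value type: arrays and objects refer back to JsonValue
type JsonValue =
  | string
  | number
  | boolean
  | null
  | JsonValue[]
  | { [key: string]: JsonValue };

// A nested value exercising the recursion
const nested: JsonValue = {
  items: [{ name: "a", tags: ["x", null] }],
  count: 2,
};
```

Any schema-level syntax for typed Json would need comparable expressiveness to describe such values.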

Another, simpler idea: as clients other than TypeScript ones are being created, TypeScript-specific syntax should be avoided.

What about just a simple Json with a name, where the client itself defines the types?

// schema.prisma

model A {
  field Json("MyStructure") /// or any other similar syntax
}
// main.ts

// This works for typescript, setup inside ts files.
// Go, for example, is probably going to be different.

declare namespace PrismaTypes {
  interface MyStructure {
    test: string;
  }
}

And the prisma generated ts file would be the following:

declare namespace PrismaTypes {}

type A = {
  // this would emit an error until the type checker
  // loads the main.ts namespace declaration
  field: PrismaTypes.MyStructure;
}

Of course, this is just a declarative example to convey the idea that any complex setup in the Prisma schema may be impossible for some language/client. Later on, JSON validation and other things could be done...

Prisma just shipped a new feature called client extensions (preview) 🎉

You can now strongly type your JSON by defining them with a Zod schema for example.

E.g:

import { PrismaClient } from "@prisma/client";
import { Profile } from "./schemas";

const prisma = new PrismaClient().$extends({
  result: {
    user: {
      profileData: {
        needs: { profile: true },
        compute({ profile }) {
          return Profile.parse(profile);
        },
      },
    },
  },
});

Wouldn't this just strictly validate our JSON types instead of typing them? Because the emitted type declarations would still be of type any.

@arthurfiorette you can use Zod just to validate if you want, however .parse outputs a strongly typed data property, so the idea is that you can integrate your API layer (or whatever system is using Prisma) with that output rather than the raw Json that Prisma directly returns from the db. Should work well especially if you also validate your writes to the Json field using Zod.
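To illustrate the validate-and-narrow idea above without pulling in Zod, the same pattern can be hand-rolled; this is a sketch (the Profile shape is an assumption), not Zod's API:

```typescript
type Profile = { firstName: string; lastName: string };

// Minimal hand-rolled validator mirroring what Profile.parse does with Zod:
// it either returns a value narrowed to Profile, or throws.
function parseProfile(value: unknown): Profile {
  if (
    typeof value === "object" &&
    value !== null &&
    typeof (value as Record<string, unknown>).firstName === "string" &&
    typeof (value as Record<string, unknown>).lastName === "string"
  ) {
    return value as Profile;
  }
  throw new Error("Invalid profile JSON");
}

// From here on, `profile` is strongly typed rather than Prisma.JsonValue
const profile = parseProfile({ firstName: "Ada", lastName: "Lovelace" });
```

A schema library like Zod does the same thing declaratively, which scales better as the JSON shape grows.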

@arthurfiorette yeah, not the ideal way, but at least the output is strongly typed and the input is validated.
My workaround to get the input validated with the same schema as the output:

  const profile: Profile = {
    firstName: "Mark",
    lastName: "Test",
    contactInfo: { "email": "", phone: "" },
  }

  await prisma.user.update({
    where: { id: "12345" },
    data: {
      infos: profile
    }
  })

Not as good as the embed type in Prisma + MongoDB but at least it's easier to work with than other solutions IMO.

For those interested, this is how you can validate the input:

const prisma = new PrismaClient().$extends({
  query: {
    user: {
      create({ args, query }) {
        args.data.infos = profileSchema.parse(args.data.infos);
        return query(args);
      }
    },
  }
})

I joined a meeting with Matt Mueller, PM @ Prisma, last month, and one of our talks was about this specific feature. They said it was on their roadmap and they had already talked internally with their team about it.

As I talked with them before, this feature would help our company a lot. I'm also seeking to hear more about this being implemented in the near future.

Besides waiting, I don't know if we can do anything else to help.

That was in September... Any updates?

I decided to use some of my free time to code a temporary solution to this. Some refactors later, I ended up with Prisma Json Types Generator.

⚠️ If you are using Prisma v5+, please stick with prisma-json-types-generator@beta until an official 3.0.0 release is ready. The current beta, beta.4, is stable and can already be used in production.

It just alters the emitted index.d.ts file to include the correct typings, so no runtime code is affected.

yarn add prisma-json-types-generator

any PRs are welcome :).

// schema.prisma

generator client {
  provider      = "prisma-client-js"
}

/// Always after the prisma-client-js generator
generator json {
  provider = "prisma-json-types-generator"
  // namespace = "PrismaJson"
  // clientOutput = "<finds it automatically>"
  // (./ -> relative to schema, or an importable path to require() it)
}

model Example {
  /// [MyType]
  normal Json

  /// [MyType]
  optional Json?

  /// [MyType]
  array Json[]
}
// index.ts

import type { Example } from '@prisma/client';

declare global {
  namespace PrismaJson {
    // you can use classes, interfaces, types, etc.
    type MyType = boolean;
  }
}

function myFunction(example: Example) {
  // example.normal   is now a  boolean
  // example.optional is now a  boolean | null
  // example.array    is now a  boolean[]
}

Having problems with prisma-json-types-generator? DO NOT use this issue thread; open an issue there instead.

@arthurfiorette this looks like an amazing and elegant solution. Just tried it and wow! Hats off!

Nice job @arthurfiorette with Prisma Json Types Generator.

Since the above seems to be the de facto solution until this ticket closes, I feel justified in asking this in this thread:

What are ES2015+ best practices for using this library? Specifically: how do I avoid using global.namespace.PrismaJson? Like I would just love to be able to reference an exported interface from any one of my .ts files. Is there an option to use ES2015 syntax to read the interface? If not, should I be declaring these in a .d.ts file to make it more clear and avoid violating @typescript-eslint/no-namespace.

Hey @irl-dan, thanks for your words :)

I used global namespaces to solve a simple problem:

For different package managers (yarn v1/v2, pnpm, npm and so on), and even for different contexts, like dev and CI, the Prisma client is emitted to different files and even different folders,
making it almost impossible to generate a correct import from node_modules into the source code. (Which would be a worse anti-pattern than using namespaces.)

We could've done some hackish import alias in your tsconfig, or some related post-processing of the type imports it would've generated, but that is a lot more work than just using something TypeScript has built in.

I acknowledge @typescript-eslint/no-namespace's purpose and even agree with it (ESModules >>>>). But the PrismaJson global namespace avoids having to import source code from node_modules, and it is, and should remain, a type-only namespace which won't affect any runtime.

Declaring the namespace inside a .d.ts file is an option that also works, but it sometimes requires changes inside tsconfig. I'll try to write something in the readme as soon as I can to clarify this.
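For readers who want the .d.ts option mentioned above, a minimal sketch might look like this (the filename is an assumption; this is type-only and emits no runtime code):

```typescript
// prisma-json.d.ts
declare global {
  namespace PrismaJson {
    // you can use classes, interfaces, types, etc.
    type MyType = { field1: string };
  }
}

export {}; // make this file a module so `declare global` is allowed
```

Depending on your tsconfig `include` settings, the file may need to be listed explicitly for the compiler to pick it up.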

@arthurfiorette I can't seem to access the global namespace I created from another file.
I created the global namespace in an internal package, and I'm trying to access it where I imported the package.

@psecuresystem the prisma json types generator repo has an issues page, no need to flood this Prisma repo.

I think this should now be possible with the client extensions:
https://twitter.com/prisma/status/1672234723769212930?s=46&t=8FLs07ZKHr7bDSpaRl2Ekw

You should be able to define a zod schema for your json and then validate on the extension.

Not sure if this is a great solution: it requires you to add handling for every query type.

Client Extensions definitely are not the solution to this issue, but they probably make a more maintainable and working workaround than before was possible. We'll keep this issue open for the actual, proper type support for Json.

prisma-json-types-generator is working very well to solve this for now, you should try it.

I am using tRPC and Prisma together, and the JSON data returned by the server has no type, which is a serious problem. But I don't think Prisma itself should be the one to specify the type for JSON data; maybe it is safer to use some tool to verify and specify the type. I found that using the result extension can effectively specify a type for a field. (I'm using translation software to post this reply.)

import { PrismaClient } from "@prisma/client";
import { z } from "zod";

const goodsSchema = z
  .array(
    z.object({
      name: z.string(),
      count: z.number()
    })
  )
  .optional();
export const db = new PrismaClient().$extends({
  result: {
    warehouse: {
      goods: {
        needs: {goods: true },
        compute(warehouse) {
          // if you don't want to throw error you can use .safeParse
          return goodsSchema.parse(warehouse.goods);
        },
      },
    },
  },
});

I think that both approaches should not be incompatible. If you plan on having a predefined JSON structure, this code you wrote could be generated by Prisma.

Just a reminder: if you are using Prisma v5+, please stick with prisma-json-types-generator@beta until an official 3.0.0 release is ready. The current beta, beta.4, is stable and can already be used in production.

Prisma Json Types Generator@3+ is stable and ready to be used with Prisma@5+. We also removed support for Prisma@4.