High-level explanation of API changes #453
I think that speaking about a v3 is too early; the new modules / APIs must be validated first. As for the API changes, here's a tiny migration guide to help with experimenting with the
|
Thanks - some follow-up questions:
Will untagged unions be supported at all?
Does that mean there's no longer such a thing as a
Will this affect the advice to use
Does that mean anything for #373? The request there was to keep un-branded refinements as an option. |
Will this be possible as Codec? I mean, the possibility to decode from two different types.
export const FaunaDocRef = (() => {
const Serialized = t.type({
'@ref': t.type({
id: FaunaID,
collection: t.type({
'@ref': t.type({
id: t.string,
collection: t.type({
'@ref': t.type({ id: t.literal('collections') }),
}),
}),
}),
}),
});
const FaunaDocRef = t.type({
id: FaunaID,
collection: t.string,
});
type FaunaDocRef = t.TypeOf<typeof FaunaDocRef>;
return new t.Type<FaunaDocRef, values.Ref, unknown>(
'FaunaDocRef',
FaunaDocRef.is,
(u, c) => {
if (u instanceof values.Ref) {
return u.collection
? t.success({
id: u.id as FaunaID, // as FaunaID is ok, we don't create ids anyway
collection: u.collection.id,
})
: t.failure(u, c);
}
return either.either.chain(Serialized.validate(u, c), (s) =>
t.success({
id: s['@ref'].id,
collection: s['@ref'].collection['@ref'].id,
}),
);
},
(a) =>
new values.Ref(
a.id,
new values.Ref(a.collection, values.Native.COLLECTIONS),
),
);
})();
export type FaunaDocRef = t.TypeOf<typeof FaunaDocRef>; |
They are supported. So either you define an encoder by hand...
import { left, right } from 'fp-ts/lib/Either'
import * as C from 'io-ts/lib/Codec'
import * as D from 'io-ts/lib/Decoder'
import * as G from 'io-ts/lib/Guard'
const NumberFromString: C.Codec<number> = C.make(
D.parse(D.string, (s) => {
const n = parseFloat(s)
return isNaN(n) ? left(`cannot decode ${JSON.stringify(s)}, should be NumberFromString`) : right(n)
}),
{ encode: String }
)
export const MyUnion: C.Codec<number | string> = C.make(D.union(NumberFromString, D.string), {
encode: (a) => (G.string.is(a) ? a : NumberFromString.encode(a))
})
...or you extend it:
import * as E from 'io-ts/lib/Encoder'
interface Compat<A> extends D.Decoder<A>, E.Encoder<A>, G.Guard<A> {}
function make<A>(codec: C.Codec<A>, guard: G.Guard<A>): Compat<A> {
return {
is: guard.is,
decode: codec.decode,
encode: codec.encode
}
}
function union<A extends ReadonlyArray<unknown>>(
...members: { [K in keyof A]: Compat<A[K]> }
): Compat<A[number]> {
return {
is: G.guard.union(...members).is,
decode: D.decoder.union(...members).decode,
encode: (a) => {
for (const member of members) {
if (member.is(a)) {
return member.encode(a)
}
}
}
}
}
const string = make(C.string, G.string)
const NumberFromString2 = make(NumberFromString, G.number)
export const MyUnion2: Compat<number | string> = union(NumberFromString2, string)
...or... something else? I don't know, any idea?
Yes it is, but please note that basically
Not sure what you mean, but the new way is:
import * as D from 'io-ts/lib/Decoder'
const MyEnum = D.literal('a', 'b')
The new signature is:
export declare function refinement<A, B extends A>(
  from: Decoder<A>,
  refinement: (a: A) => a is B,
  expected: string
): Decoder<B>
whereas the old one is:
export declare function refinement<C extends Any>(
  codec: C,
  predicate: Predicate<TypeOf<C>>,
  name?: string
): RefinementC<C>
which I still consider a bad API, since the predicate is not carried to the type level. |
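The difference can be illustrated with a minimal standalone sketch. The Decoder shape and helper names below are assumptions for illustration, not the actual io-ts exports; the point is that a type-guard refinement carries the narrowed type B into the result, while a plain predicate cannot:

```typescript
// Minimal standalone sketch (assumed names; not the actual io-ts internals).
type Either<E, A> = { _tag: 'Left'; left: E } | { _tag: 'Right'; right: A }
const left = <E>(e: E): Either<E, never> => ({ _tag: 'Left', left: e })
const right = <A>(a: A): Either<never, A> => ({ _tag: 'Right', right: a })

interface Decoder<A> {
  readonly decode: (u: unknown) => Either<string, A>
}

// the new-style signature: the refinement function is a type guard,
// so the result is a Decoder<B>, not a Decoder<A>
const refinement = <A, B extends A>(
  from: Decoder<A>,
  refine: (a: A) => a is B,
  expected: string
): Decoder<B> => ({
  decode: (u) => {
    const e = from.decode(u)
    if (e._tag === 'Left') return e
    return refine(e.right) ? right(e.right) : left(`cannot decode, should be ${expected}`)
  }
})

const string: Decoder<string> = {
  decode: (u) => (typeof u === 'string' ? right(u) : left('not a string'))
}

// the narrowed literal union is visible in the decoder's type
const ab: Decoder<'a' | 'b'> = refinement(
  string,
  (s): s is 'a' | 'b' => s === 'a' || s === 'b',
  '"a" | "b"'
)
```

With the old predicate-based API, `ab` could only ever be a decoder of `string`.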
@gcanti Will a JSON type be possible? |
@steida Here is how I solved it using the suggestions from @gcanti above.
import * as C from 'io-ts/lib/Codec';
import * as D from 'io-ts/lib/Decoder';
import * as E from 'io-ts/lib/Encoder';
import * as G from 'io-ts/lib/Guard';
export interface Compat<A> extends D.Decoder<A>, E.Encoder<A>, G.Guard<A> {}
export const makeCompat: <A>(c: C.Codec<A>, g: G.Guard<A>) => Compat<A> = (c, g) => ({
is: g.is,
decode: c.decode,
encode: c.encode,
});
export const lazy = <A>(id: string, f: () => Compat<A>): Compat<A> => {
return makeCompat(C.lazy(id, f), G.guard.lazy(id, f));
};
export const untaggedUnion: <A extends ReadonlyArray<unknown>>(
...ms: { [K in keyof A]: Compat<A[K]> }
) => Compat<A[number]> = (...ms) => ({
is: G.guard.union(...ms).is,
decode: D.decoder.union(...ms).decode,
encode: (a) => ms.find((m) => m.is(a)),
});
type Json = string | number | boolean | null | { [property: string]: Json } | Json[];
const Json: Compat<Json> = lazy<Json>('Json', () =>
untaggedUnion(
makeCompat(C.string, G.string),
makeCompat(C.number, G.number),
makeCompat(C.boolean, G.boolean),
makeCompat(C.literal(null), G.literal(null)),
makeCompat(C.record(Json), G.record(Json)),
makeCompat(C.array(Json), G.array(Json)),
),
);
const json = Json.decode([1, [[['1']], { a: 1, b: false }]]);
console.log(json);
// {
// _tag: 'Right',
// right: [ 1, [ [ [ '1' ] ], { a: 1, b: false } ] ]
// } |
@IMax153 @steida the Json codec can also be defined like this:
import * as C from 'io-ts/lib/Codec'
import * as D from 'io-ts/lib/Decoder'
import * as E from 'io-ts/lib/Encoder'
type Json = string | number | boolean | null | { [key: string]: Json } | Array<Json>
const JsonDecoder = D.lazy<Json>('Json', () =>
D.union(C.string, C.number, C.boolean, C.literal(null), C.record(Json), C.array(Json))
)
const Json: C.Codec<Json> = C.make(JsonDecoder, E.id) |
@gcanti maybe there could be a special case for unions of codecs with identity encoders:
t.union([t.string, t.number, t.type({ myProp: t.boolean })])
☝️ that's assuming it's possible to "propagate" the identity encoder from props, i.e.
Not in the case of |
@mmkal maybe, but supporting untagged unions means that the implementation of
declare function union<A extends ReadonlyArray<unknown>>(
...members: { [K in keyof A]: Encoder<A[K]> }
): Encoder<A[number]>
should work with any
I think that most of the time:
declare function parse<A, B>(from: Decoder<A>, parser: (a: A) => Either<string, B>): Decoder<B> |
I think it would be useful to include a combinator like:
const pipe = <A>(da: D.Decoder<A>) => <B>(db: D.Decoder<B>): D.Decoder<B> => ({
decode: flow(da.decode, E.chain(db.decode)),
});
Or, I dunno, would calling that |
Another thing I miss from the new API:
export const decode = <A>(
decoder: Decoder<A>,
errorMetadata?: Record<string, unknown>
) =>
flow(
decoder.decode,
E.mapLeft(error => makeIoTsDecodeError(error, errorMetadata))
);
I used to be able to automatically smoosh in the decoder name:
makeIoTsDecodeError(error, { decoderName: decoder.name, ...errorMetadata })
But that's not possible with the new decoders. I suppose I could extend |
example?
Because it makes the APIs unnecessarily complicated; IMO the error messages are readable even without the names (and you can always use |
I know. In some ways this feels similar to the discussion we had recently about re-adding the second type parameter to
Yeah I think i need to experiment with that a bit - I guess I would use it to replace the text inside a leaf? |
@gcanti re:
const MyEvent = kinesisEvent(t.type({ foo: t.string }))
The
Is this possible with the new API? Another question, since this issue's title is "High-level explanation of API changes": could you give a recommendation for users who rely on |
Yes, but there are many different ways to get the final result, so I guess it really depends on your coding style ("everything is a decoder" or "I want to lift my business / parsing logic only once"?). For example:
import * as E from 'fp-ts/lib/Either'
import { Json } from 'io-ts/lib/JsonEncoder'
import * as D from 'io-ts/lib/Decoder'
import { flow } from 'fp-ts/lib/function'
// > decodes base 64
declare function decodeBase64(s: string): E.Either<string, string>
// > parses the decoded string as JSON
declare function parseJSON(s: string): E.Either<string, Json>
// > then validates the json using io-ts
declare function decodeItem(json: Json): E.Either<string, { foo: string }>
const parser = flow(decodeBase64, E.chain(parseJSON), E.chain(decodeItem))
export const X = D.parse(D.string, parser)
Actually one of the goals of my rewrite was to get rid of those meta infos (at the type level) |
@leemhenson @mmkal given the good results in #478 I'm going to make the following breaking changes:
parse: from
export declare function parse<A, B>(from: Decoder<A>, parser: (a: A) => Either<string, B>): Decoder<B>
to
export declare function parse<A, B>(parser: (a: A) => E.Either<DecodeError, B>): (from: Decoder<A>) => Decoder<B>
Pros:
import { pipe } from 'fp-ts/lib/pipeable'
import * as D from '../src/Decoder2'
import { Json } from '../src/JsonEncoder'
// > decodes base 64
declare const Base64: D.Decoder<string>
// > parses the decoded string as JSON
declare const Json: D.Decoder<Json>
// > then validates the json using io-ts
declare const Item: D.Decoder<{ foo: string }>
export const X = pipe(D.string, D.parse(Base64.decode), D.parse(Json.decode), D.parse(Item.decode)) |
Are you changing the signature of Decoder from:
export interface Decoder<A> {
readonly decode: (u: unknown) => Either<DecodeError, A>
}
to:
export interface Decoder<A, B = unknown> {
readonly decode: (u: B) => Either<DecodeError, A>
} ? Otherwise |
@leemhenson isn't that what happens in your proposal too?
const pipe = <A>(da: D.Decoder<A>) => <B>(db: D.Decoder<B>): D.Decoder<B> => ({
decode: flow(da.decode, E.chain(db.decode)),
}); |
Yes, I never said mine was optimal 😅 . I'm just re-raising the point I made earlier:
If we did that, then the piped decoders wouldn't need to keep checking the same things over and over. |
Why is that in bold? Emphasis not mine! 😬 |
@leemhenson maybe it's just a bias of mine, but I consider a "proper decoder" an arrow that goes from an unknown input, for some effect. So personally I would model my pipeline starting from normal Kleisli arrows and then define a suitable decoder based on the use case at hand.
// kleisli arrows in my domain
declare function decodeBase64(s: string): E.Either<string, string>
declare function parseJSON(s: string): E.Either<string, Json>
declare function decodeItem(json: Json): E.Either<string, { foo: string }>
// I can compose them as usual using the `Monad` instance of `Either`
const decode = flow(decodeBase64, E.chain(parseJSON), E.chain(decodeItem))
// and then define my decoder
export const MyDecoder = pipe(
D.string,
D.parse((s) =>
pipe(
decode(s),
E.mapLeft((e) => D.error(s, e))
)
)
)
Alternatively I could have already defined some decoders:
declare const Base64: D.Decoder<string>
declare const Json: D.Decoder<Json>
declare const Item: D.Decoder<{ foo: string }>
If this is the case, again I can compose them via parse:
// and I can compose them via parse
export const MyDecoder2 = pipe(Base64, D.parse(Json.decode), D.parse(Item.decode))
// or even this if I want micro optimizations
export const MyDecoder3 = pipe(
Base64,
D.parse((s) =>
pipe(
parseJSON(s),
E.mapLeft((e) => D.error(s, e))
)
),
D.parse(Item.decode)
)
Do we really need something more? Genuine question; I'm open to suggestions if you think there's an ergonomic issue with the current APIs. In the end, if you can define a
export interface Decoder<A, B> {
readonly decode: (a: A) => Either<DecodeError, B>
}
then you can just define |
It's not a major issue, just a niggle I keep encountering because I have scenarios like these:
So in this case I would like to have primitive decoders:
intFromNumber: Decoder<Int, number>
intFromString: Decoder<Int, string>
fooIdFromInt: Decoder<FooId, Int>
centsFromInt: Decoder<Cents, Int>
then I can compose them together to make complex ones:
pipe(
stringFromUnknown,
intFromString,
fooIdFromInt,
) // => Decoder<FooId, unknown>
I could wrap all that logic up myself. But, hey, that's just me. I might just be using the wrong tool for the job. 🤷 |
Well, while I love
that will be future-proof even if I replace it later. In elm-ts I removed the hard dependency. But that's just a point of view; yours is sensible too. Let me just think more about all of this... |
Yeah I'm only talking about composition of |
However the converse is also true, so my POV is actually biased and without noticeable substance. @leemhenson, I'll reconsider my position. |
@leemhenson while experimenting with Kleisli arrows, it looks like I found something more general:
interface Kleisli<M extends URIS2, I, E, A> {
readonly decode: (i: I) => Kind2<M, E, A>
}
for which I can define a composition:
const compose = <M extends URIS2, E>(M: Monad2C<M, E>) => <A, B>(ab: Kleisli<M, A, E, B>) => <I>(
ia: Kleisli<M, I, E, A>
): Kleisli<M, I, E, B> => ({
decode: (i) => M.chain(ia.decode(i), ab.decode)
})
Then:
interface KleisliDecoder<I, A> extends K.Kleisli<E.URI, I, DecodeError, A> {}
and:
interface Decoder<A> extends KD.KleisliDecoder<unknown, A> {}
Example:
import { pipe } from 'fp-ts/lib/pipeable'
import * as D from '../src/Decoder'
import * as KD from '../src/KleisliDecoder'
interface IntBrand {
readonly Int: unique symbol
}
type Int = number & IntBrand
interface CentsBrand {
readonly Cents: unique symbol
}
type Cents = number & CentsBrand
declare const IntFromString: KD.KleisliDecoder<string, Int>
declare const CentsFromInt: KD.KleisliDecoder<Int, Cents>
// const result: D.Decoder<Cents>
export const result = pipe(
D.string,
D.compose(IntFromString),
D.compose(CentsFromInt)
) |
/*
const kdecoder: KD.KleisliDecoder<{
name: unknown;
age: string;
cents: Int;
}, {
name: string;
age: Int;
cents: Cents;
}>
*/
export const kdecoder = KD.type({
name: D.string,
age: IntFromString,
cents: CentsFromInt
})
EDIT: same for tuple:
// const kdecoder2: KD.KleisliDecoder<[unknown, string, Int], [string, Int, Cents]>
export const kdecoder2 = KD.tuple(D.string, IntFromString, CentsFromInt) |
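The declare-only fragments above can be made concrete. Below is a self-contained sketch of input-typed ("Kleisli") decoders and their composition; the shapes and names (KDecoder, compose, IntFromString, CentsFromInt) are assumptions for illustration, not the actual KleisliDecoder module:

```typescript
// Self-contained sketch of input-typed decoders and compose
// (assumed names, not the real io-ts exports).
type Either<E, A> = { _tag: 'Left'; left: E } | { _tag: 'Right'; right: A }
const left = <E>(e: E): Either<E, never> => ({ _tag: 'Left', left: e })
const right = <A>(a: A): Either<never, A> => ({ _tag: 'Right', right: a })

interface KDecoder<I, A> {
  readonly decode: (i: I) => Either<string, A>
}

// compose: feed the output of the first decoder into the second
const compose = <A, B>(ab: KDecoder<A, B>) => <I>(ia: KDecoder<I, A>): KDecoder<I, B> => ({
  decode: (i) => {
    const e = ia.decode(i)
    return e._tag === 'Left' ? e : ab.decode(e.right)
  }
})

type Int = number & { readonly Int: unique symbol }
type Cents = number & { readonly Cents: unique symbol }

const string: KDecoder<unknown, string> = {
  decode: (u) => (typeof u === 'string' ? right(u) : left('not a string'))
}
const IntFromString: KDecoder<string, Int> = {
  decode: (s) => (/^-?\d+$/.test(s) ? right(Number(s) as Int) : left('not an integer string'))
}
const CentsFromInt: KDecoder<Int, Cents> = {
  decode: (n) => right((n * 100) as Cents)
}

// KDecoder<unknown, Cents>, built from typed pieces
const CentsFromUnknown = compose(CentsFromInt)(compose(IntFromString)(string))
```

Each intermediate decoder only has to handle its own input type; only the first stage deals with `unknown`.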
v2.2.7 released https://github.com/gcanti/io-ts/releases/tag/2.2.7 |
I noticed there's a schemable instance for the old
|
@treybrisbane that's an interesting approach and I agree that there are good use cases for it. What's the difference between a decoder and an encoder though?
type Decoder<I, E, A> = (i: I) => Either<E, A>
type Encoder<I, A> = (i: I) => A
Precisely the fact that encoders can't fail. So if we allow encoders to possibly fail, then I think we can just unify decoders and encoders: they are all just decoders.
import { Either, right } from 'fp-ts/lib/Either'
export type Decoder<I, E, A> = (i: I) => Either<E, A>
export type DecodeError = string
// which one is a "decoder"? or an "encoder"? it doesn't matter anymore
export const NumberFromString: Decoder<string, DecodeError, number> = (s) =>
right(parseFloat(s))
export const StringFromNumber: Decoder<number, DecodeError, string> = (n) =>
right(String(n)) |
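Under this unification, both directions compose with the same chain operation. A small self-contained sketch (the function-style Decoder type and the chain helper are assumptions for illustration):

```typescript
// Self-contained sketch: once encoders may fail, both directions are just
// functions (i: I) => Either<E, A> and compose with one chain operation.
type Either<E, A> = { _tag: 'Left'; left: E } | { _tag: 'Right'; right: A }
const left = <E>(e: E): Either<E, never> => ({ _tag: 'Left', left: e })
const right = <A>(a: A): Either<never, A> => ({ _tag: 'Right', right: a })

type Decoder<I, E, A> = (i: I) => Either<E, A>

const NumberFromString: Decoder<string, string, number> = (s) => {
  const n = parseFloat(s)
  return Number.isNaN(n) ? left(`cannot decode ${JSON.stringify(s)}`) : right(n)
}
const StringFromNumber: Decoder<number, string, string> = (n) => right(String(n))

// Kleisli composition in the Either monad
const chain = <I, E, A, B>(f: Decoder<I, E, A>, g: Decoder<A, E, B>): Decoder<I, E, B> => (i) => {
  const e = f(i)
  return e._tag === 'Left' ? e : g(e.right)
}

// a "round trip": decode then re-encode, both expressed as decoders
const roundTrip = chain(NumberFromString, StringFromNumber)
```

The "decoder" and the "encoder" are indistinguishable to `chain`; only the type parameters differ.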
@treybrisbane I'm not sure I understand how it works. Let's say we unify decoders and encoders and we have the following:
import { pipe } from 'fp-ts/function'
import * as D from 'io-ts/Decoder'
// const trim: D.Decoder<string, string>
export const trim = pipe(
D.id<string>(),
D.map((s) => s.trim())
)
// const double: D.Decoder<number, number>
export const double = pipe(
D.id<number>(),
D.map((n) => n * 2)
) how would you define a decoder: |
@gcanti I'm assuming so. If so, you'd need to replace it with something like:
import { pipe } from 'fp-ts/function';
import * as E from 'fp-ts/Either';
import * as D from 'io-ts/Decoder';
const union = <I1, E1, A1, I2, E2, A2>(
decoder1: D.Decoder<I1, E1, A1>,
decoder2: D.Decoder<I2, E2, A2>,
): D.Decoder<I1 | I2, E1 | E2, A1 | A2> =>
pipe(
decoder1,
E.orElse(decoder2),
);
const trimOrDouble = union(trim, double);
(I'm rushing this so I've probably made a mistake, but the idea is that you run the first decoder, return the result if it's successful, otherwise run the second decoder, return the result if it's successful, etc.) |
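A runnable variant of that idea, as a standalone sketch (function-style decoders are an assumption here, not the io-ts module): note that for the fallback to be well-typed, the combined decoder's input must be acceptable to both members, i.e. I1 & I2 rather than I1 | I2.

```typescript
// Standalone sketch of "try the first decoder, fall back to the second".
type Either<E, A> = { _tag: 'Left'; left: E } | { _tag: 'Right'; right: A }
const left = <E>(e: E): Either<E, never> => ({ _tag: 'Left', left: e })
const right = <A>(a: A): Either<never, A> => ({ _tag: 'Right', right: a })

type Decoder<I, E, A> = (i: I) => Either<E, A>

// both members must accept the input, hence I1 & I2
const union = <I1, E1, A1, I2, E2, A2>(
  d1: Decoder<I1, E1, A1>,
  d2: Decoder<I2, E2, A2>
): Decoder<I1 & I2, E2, A1 | A2> => (i) => {
  const e = d1(i)
  return e._tag === 'Right' ? e : d2(i)
}

const numberFromUnknown: Decoder<unknown, string, number> = (u) =>
  typeof u === 'number' ? right(u) : left('not a number')
const stringFromUnknown: Decoder<unknown, string, string> = (u) =>
  typeof u === 'string' ? right(u) : left('not a string')

// Decoder<unknown, string, number | string>
const numberOrString = union(numberFromUnknown, stringFromUnknown)
```

Only the last member's error type survives, since earlier failures are discarded when the fallback runs.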
I did not find an example for a generic Decoder, so I experimented. Maybe this will be useful for someone.
const endpoint = <
A extends Record<string, unknown>,
R extends Record<string, unknown>,
>({
args = {} as { [K in keyof A]: D.Decoder<unknown, A[K]> },
result = {} as { [K in keyof R]: D.Decoder<unknown, R[K]> },
}: {
args?: { [K in keyof A]: D.Decoder<unknown, A[K]> };
result?: { [K in keyof R]: D.Decoder<unknown, R[K]> };
}) => ({
args: D.struct(args),
result: D.struct(result),
});
const api = {
randomThing: endpoint({
result: { thing: Thing },
}),
thingById: endpoint({
args: { id: D.string },
result: { thing2: Thing },
}),
fooThing: endpoint({
args: { id: D.string },
}),
}; |
@gcanti Is it possible to define or enforce an empty struct? The use case is an endpoint with no arguments.
const EmptyStruct = C.struct({});
// This should be an error, IMHO.
EmptyStruct.encode({a: 1}) |
@steida I would define a custom decoder |
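gcanti's suggested decoder isn't shown above, but a custom empty-struct codec might look like the following sketch (the Codec shape and all names here are assumptions for illustration):

```typescript
// Sketch of a custom codec that accepts only an object with zero own keys.
type Either<E, A> = { _tag: 'Left'; left: E } | { _tag: 'Right'; right: A }
const left = <E>(e: E): Either<E, never> => ({ _tag: 'Left', left: e })
const right = <A>(a: A): Either<never, A> => ({ _tag: 'Right', right: a })

interface Codec<A> {
  readonly decode: (u: unknown) => Either<string, A>
  readonly encode: (a: A) => unknown
}

// Record<string, never> makes EmptyStruct.encode({ a: 1 }) a compile-time
// error, addressing the concern above; decode also rejects extra keys.
const EmptyStruct: Codec<Record<string, never>> = {
  decode: (u) =>
    typeof u === 'object' && u !== null && !Array.isArray(u) && Object.keys(u).length === 0
      ? right({} as Record<string, never>)
      : left('expected an empty struct'),
  encode: () => ({})
}
```

The `Record<string, never>` output type is what turns the misuse shown above into a type error rather than a silent success.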
Hi all, I've been working on a new version. This time I started from a bunch of use cases and long-standing issues. The two biggest changes are:
therefore the Decoder interface changes from:
export interface Decoder<I, A> {
readonly decode: (i: I) => Either<DecodeError, A>
}
to:
export interface Decoder<I, E, A> {
readonly decode: (i: I) => These<E, A>
}
which should unlock the possibility to:
There are many other use cases I'm trying to solve, but I want to start from this list to get early feedback from you all. You can find the source code in the poc branch.
customization of error types
Let's say we want to check the minimum length of a string:
// the model of the custom error
export interface MinLengthE<N extends number> {
readonly _tag: 'MinLengthE'
readonly minLength: N
readonly actual: string
}
All custom errors must be wrapped in a LeafE:
export interface MinLengthLE<N extends number> extends LeafE<MinLengthE<N>> {}
// constructor
export const minLengthLE = <N extends number>(minLength: N, actual: string): MinLengthLE<N> =>
leafE({ _tag: 'MinLengthE', minLength, actual })
Now I can define my custom combinator:
export const minLength = <N extends number>(minLength: N): Decoder<string, MinLengthLE<N>, string> => ({
decode: (s) => (s.length >= minLength ? success(s) : failure(minLengthLE(minLength, s)))
})
const string3 = minLength(3)
assert.deepStrictEqual(string3.decode('abc'), success('abc'))
assert.deepStrictEqual(string3.decode('a'), failure(minLengthLE(3, 'a')))
Let's use it in a struct:
export const PersonForm = fromStruct({
name: string3,
age: number
})
/*
const PersonForm: FromStructD<{
name: Decoder<string, MinLengthLE<3>, string>;
age: numberUD;
}>
*/
The decoding error is fully typed; this means that you can pattern match on the error:
export const formatPersonFormE = (de: ErrorOf<typeof PersonForm>): string =>
de.errors
.map((e): string => {
switch (e.key) {
case 'name':
// this is of type `MinLengthE<3>` ---v
return `invalid name, must be ${e.error.error.minLength} or more characters long`
case 'age':
return 'invalid age'
}
})
.join(', ')
assert.deepStrictEqual(
pipe(PersonForm.decode({ name: 'name', age: 18 }), TH.mapLeft(formatPersonFormE)),
success({ name: 'name', age: 18 })
)
assert.deepStrictEqual(
pipe(PersonForm.decode({ name: '', age: 18 }), TH.mapLeft(formatPersonFormE)),
failure('invalid name, must be 3 or more characters long')
)
assert.deepStrictEqual(
pipe(PersonForm.decode({ name: '', age: null }), TH.mapLeft(formatPersonFormE)),
failure('invalid name, must be 3 or more characters long, invalid age')
)
return warnings other than errors
The number decoder, for example:
export const formatNumberE = (de: ErrorOf<typeof number>): string => {
switch (de.error._tag) {
case 'NumberE':
return 'the input is not even a number'
case 'NaNE':
return 'the input is NaN'
case 'InfinityE':
return 'the input is Infinity'
}
}
assert.deepStrictEqual(pipe(number.decode(1), TH.mapLeft(formatNumberE)), success(1))
assert.deepStrictEqual(pipe(number.decode(null), TH.mapLeft(formatNumberE)), failure('the input is not even a number'))
assert.deepStrictEqual(pipe(number.decode(NaN), TH.mapLeft(formatNumberE)), warning('the input is NaN', NaN))
optionally fail on additional properties
Additional properties are still stripped out, but they are also reported as warnings:
export const A = struct({ a: string })
assert.deepStrictEqual(
// v-- this utility transforms a decoding error into a tree
pipe(A.decode({ a: 'a', c: true }), draw),
warning('1 error(s) found while checking keys\n└─ unexpected key "c"', { a: 'a' })
// warning ---^ ^-- stripped out result
)
Then you can choose to "absolve" the warning or not. Since additional properties are reported as warnings rather than errors, this mechanism plays well with intersections too:
export const B = struct({ b: number })
export const AB = pipe(A, intersect(B))
assert.deepStrictEqual(
pipe(AB.decode({ a: 'a', b: 1, c: true }), draw),
warning(
`2 error(s) found while decoding (intersection)
├─ 1 error(s) found while decoding member 0
│ └─ 1 error(s) found while checking keys
│ └─ unexpected key "c"
└─ 1 error(s) found while decoding member 1
└─ 1 error(s) found while checking keys
└─ unexpected key "c"`,
{ a: 'a', b: 1 }
)
) ^ here only the |
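The success/failure/warning results above rest on These, where a value and errors can coexist. A minimal standalone sketch of the idea (the shapes below are assumptions for illustration, not the actual io-ts/fp-ts types):

```typescript
// Minimal These sketch: Left = failure, Right = success,
// Both = success *with* warnings (the value is still produced).
type These<E, A> =
  | { _tag: 'Left'; left: E }
  | { _tag: 'Right'; right: A }
  | { _tag: 'Both'; left: E; right: A }

const success = <A>(a: A): These<never, A> => ({ _tag: 'Right', right: a })
const warning = <E, A>(e: E, a: A): These<E, A> => ({ _tag: 'Both', left: e, right: a })

// a struct-like key check that strips unknown keys but reports them as warnings
const keysOf = (expected: ReadonlyArray<string>) => (
  input: Record<string, unknown>
): These<ReadonlyArray<string>, Record<string, unknown>> => {
  const out: Record<string, unknown> = {}
  const unexpected: string[] = []
  for (const k of Object.keys(input)) {
    if (expected.includes(k)) out[k] = input[k]
    else unexpected.push(`unexpected key ${JSON.stringify(k)}`)
  }
  return unexpected.length === 0 ? success(out) : warning(unexpected, out)
}

const checkA = keysOf(['a'])
```

Because `Both` carries the stripped result alongside the warnings, a caller can either accept the value ("absolve") or treat the warnings as fatal.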
Why is that? Because the error type must remain open to custom errors. A possible solution is to define a sum type representing all errors:
// I can pattern match on `DecodeError` while retaining the possibility to define custom errors
export type DecodeError<E> =
| UnexpectedKeysE
| MissingKeysE
| UnexpectedIndexesE
| MissingIndexesE
| LeafE<E> // <= leaf error
| NullableE<E>
| etc...
When I try to define a
declare const toTree: <E>(de: DecodeError<E>) => Tree<string>
I also need a way to transform a generic E into a Tree<string>:
export declare const toTreeWith: <E>(toTree: (e: E) => Tree<string>) => (de: DecodeError<E>) => Tree<string>
Now I can define
const toTree = toTreeWith(toTreeBuiltin)
But what if the decoding error contains a custom error? Fortunately the whole mechanism is type safe and I get a typescript error:
pipe(PersonForm.decode({ name: '', age: 18 }), TH.mapLeft(toTree)) // Type 'MinLengthLE<3>' is not assignable to type 'LeafE<BuiltinE>'
As a fix I can define my custom toTree:
// my custom error --v
export const myToTree = toTreeWith((e: BuiltinE | MinLengthE<number>) => {
switch (e._tag) {
case 'MinLengthE':
return tree(`cannot decode ${format(e.actual)}, must be ${e.minLength} or more characters long`)
default:
return toTreeBuiltin(e)
}
})
assert.deepStrictEqual(
pipe(PersonForm.decode({ name: '', age: 18 }), TH.mapLeft(myToTree)),
failure(
tree('1 error(s) found while decoding (struct)', [
tree('1 error(s) found while decoding required key "name"', [
tree('cannot decode "", must be 3 or more characters long')
])
])
)
)
A notable decoding error is the message error.
use case: old decoder, custom error message
// throw-away utility for this issue
export const toString = flow(draw, print)
export const mystring = pipe(
string,
mapLeft(() => message(`please insert a string`))
)
assert.deepStrictEqual(
pipe(string.decode(null), toString),
`Errors:
cannot decode null, expected a string` // <= default message
)
assert.deepStrictEqual(
pipe(mystring.decode(null), toString),
`Errors:
please insert a string` // <= custom message
)
use case: new decoder, custom error message
export const date: Decoder<unknown, MessageLE, Date> = {
decode: (u) => (u instanceof Date ? success(u) : failure(message('not a Date')))
}
assert.deepStrictEqual(
pipe(date.decode(null), toString),
`Errors:
not a Date`
)
use case: new decoder, multiple custom messages (#487)
export interface UsernameBrand {
readonly Username: unique symbol
}
export type Username = string & UsernameBrand
const USERNAME_REGEX = /(a|b)*d/
export const Username = pipe(
mystring,
compose({
decode: (s) =>
s.length < 2
? failure(message('too short'))
: s.length > 4
? failure(message('too long'))
: USERNAME_REGEX.test(s)
? failure(message('bad characters'))
: success(s as Username)
})
)
assert.deepStrictEqual(
pipe(tuple(Username, Username, Username, Username, Username).decode([null, 'a', 'bbbbb', 'abd', 'ok']), toString),
`Errors:
4 error(s) found while decoding (tuple)
├─ 1 error(s) found while decoding required component 0
│ └─ please insert a string
├─ 1 error(s) found while decoding required component 1
│ └─ too short
├─ 1 error(s) found while decoding required component 2
│ └─ too long
└─ 1 error(s) found while decoding required component 3
└─ bad characters`
) |
It seems C.sum is not type-safe, while D.sum is. I can mistype 'type' or 'A', and TS is silent.
const MySum = C.sum('type')({
A: C.struct({ type: C.literal('A'), a: C.string }),
B: C.struct({ type: C.literal('B'), b: C.number }),
}); |
@gcanti the |
@gcanti I just tried |
Link to the poc for future readers:
@gcanti Wanted to report a use case for consideration when developing the experimental interfaces, and to see if there's a workaround we're missing. Is it possible to derive a guard from a transformed decoder? Our best attempt so far:
const [DomainModelDecoder, DomainModelGuard] = pipe(
iotsDecoderInterpreter(ApiSchema), // Decoder derived from schema.
(apiOutputDecoder) => {
const decoderWithDomainTransformation = pipe(
apiOutputDecoder,
iotsDecoder.map((resourceApiOutput) => /* Domain transformation logic. */)
)
const guardFromDecoder = (
decoder: typeof decoderWithDomainTransformation
) => (
value: unknown
): value is iotsDecoder.TypeOf<typeof decoderWithDomainTransformation> =>
isRight(decoder.decode(value))
return [
decoderWithDomainTransformation,
guardFromDecoder(decoderWithDomainTransformation),
]
}
) |
Is it still possible to create an untagged union with the help of a guard? I can't figure out what the new signature would be:
interface Compat<A> extends D.Decoder<A>, E.Encoder<A>, G.Guard<A> {}
function make<A>(codec: C.Codec<A>, guard: G.Guard<A>): Compat<A> {
return {
is: guard.is,
decode: codec.decode,
encode: codec.encode
}
}
function union<A extends ReadonlyArray<unknown>>(
...members: { [K in keyof A]: Compat<A[K]> }
): Compat<A[number]> {
return {
is: G.guard.union(...members).is,
decode: D.decoder.union(...members).decode,
encode: (a) => {
for (const member of members) {
if (member.is(a)) {
return member.encode(a)
}
}
}
}
} |
@gcanti thanks for this new API. We like it very much. 1. Do you have plans to graduate it to stable? And 2. When you do, would you consider adding top-level
@florianbepunkt I have done basically this with the POC if you're interested: Schemable:
Schema:
Type:
And Eq (bonus!):
|
As far as I can tell, the
Does that about cover it? I'd happily make a PR to address these issues if that would help! I really, really like these features - this is my number one dream for the fp-ts ecosystem - and I'd like to do whatever I can to help them get published in the experimental modules!
Just to address the comments suggesting changes:
@safareli (comment) - I think you're suggesting something like:
const test = pipe(
D.struct({ a: D.string }),
D.intersect(D.struct({ b: D.number })),
D.mapLeft(NEA.of),
D.mapLeft(DE.compoundE('custom error'))
)
@fernandomrtnz (comment) - Guards can be derived from schemas. Would it be possible to go from domain model schema ->
@jamiehodge (comment) - this has been a feature since #507
@kalda341 (comment) - I think this same behavior could be achieved using a 'Decoder.refine' for each case:
const union = D.union(
pipe(
D.id<unknown>(),
D.refine((u: unknown): u is string => typeof u === 'string', DE.message('string'))
),
pipe(
D.id<unknown>(),
D.refine((u: unknown): u is number => typeof u === 'number', DE.message('number'))
)
)
The interfaces for Decoder, Encoder, or any other Schema all imply the behavior of narrowing one type into another. Is there an advantage to requiring all |
@anthonyjoeseph The key is being able to define a schema which supports untagged unions for Eq and Decoder. 99.9% of my schemas are fine; however, there are a few deeply nested hairy bits that I don't have a lot of control over that require me to use union. Refinements do seem to be the way to go though (I didn't really understand them up until now), so I'll have a go at refactoring my implementation to use them. Your PR plan sounds great BTW, it's just union that is killing me!
Update: Refine doesn't allow me to actually specify the schema which should apply to the refined type, so it doesn't work for my use case. For more context, the types I need to union look like:
interface A {
a: string;
b: string;
}
interface B {
c: string;
d: number;
} |
@kalda341 afaict untagged unions of structs (or anything else) have always been supported by Decoder:
const decoder: Decoder<unknown, A | B> = Dec.union(
Dec.struct({ a: Dec.string, b: Dec.string }),
Dec.struct({ c: Dec.string, d: Dec.number })
) I just saw this comment, which helped me understand your solution - a more generalized |
Hi all, I am curious to know if there is a plan for when the experimental stage will end. I like the new way of creating codecs etc., but I am not sure if I can use it in production without worrying that the next release will break my code. I see this issue/ticket has been open since 2020. |
I made this PR as an attempt to complete the poc branch. These kinds of PRs ultimately end up creating more work for the maintainers - however, I understand the community's concern that io-ts has stagnated (see #635) and I'd like to surface the work that has been ongoing but less visible (it might also save other developers from re-inventing the wheel #636). The PR is against the 'poc' branch so that a clear diff can be seen of my changes vs gcanti's, although the intention is for this to be ultimately merged into master as a continuation of the 'experimental' modules. I've also updated the READMEs to document these changes. Since the whole point is to 'de-stagnate' io-ts, my hope is to focus further discussion on smaller tweaks & implementation details, and have additional major changes or additions made as their own PRs (the luxury of having these features marked as 'experimental'). As per the notes in the poc branch, I've moved all |
I know this is a little tangential, but I just wanna restate that I feel encoding should be a potentially failing operation (previously mentioned above). |
@treybrisbane it is - a codec is now a decoder and its dual packed together (as per gcanti's changes) Also, please feel free to post questions like this on the PR itself - hopefully it's a bit more accessible, since the code & its documentation can serve as a record of progress rather than having to parse dozens of comments |
Sorry to be bothersome, but I wonder whether the explicit errors work shouldn't be delayed until another major release, allowing the current, well-functioning experimental API to pass into stability. It might sound childish, but I really wish I could import Codec/Decoder/etc. directly from |
📖 Documentation
Would it be possible to add a paragraph or two to the readme on the differences between "experimental" (Decoder/Encoder/Codec/Schema) and "stable" (Type), and who should use which? Assuming the concept of Type will be removed in v3, does it mean existing usage will need to change? If so, will there be a migration guide (I assume Codec largely replaces Type)? I've seen the S.make(S => ...) syntax around in a few places, but it's not immediately clear if applications relying on io-ts will need to use it, or if it's mainly a low-level construct. A link to a tracking issue could also work.