Fix fileuploads with Hapi #4943
Conversation
We are currently planning to remove the direct integration with `graphql-upload`.
Hi, and thank you for your answer. There is one thing that I don't know how to deal with when using `graphql-upload` externally. What would you think about something like a `getQuery` option?

```typescript
const { graphqlResponse, responseInit } = await runHttpQuery(
  [request, h],
  {
    method: request.method.toUpperCase(),
    options: options.graphqlOptions,
    query: options.getQuery
      ? options.getQuery(request)
      : request.method === 'post'
        ? // TODO type payload as string or Record
          (request.payload as any)
        : request.query,
    request: convertNodeHttpToRequest(request.raw.req),
  },
);
```
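To make the proposal concrete, here is a minimal, self-contained sketch of how such a hook could behave. The types and the `resolveQuery` helper below are simplified stand-ins for illustration, not apollo-server-hapi's actual internals; `getQuery` is the hypothetical option name from the snippet above.

```typescript
// Simplified stand-in for hapi's Request (hypothetical, for illustration only).
interface MiniRequest {
  method: string;
  payload?: unknown; // hapi's parsed body
  query?: unknown;   // hapi's parsed query string
}

interface HandlerOptions {
  // When provided, the integration would call this instead of reading
  // request.payload / request.query directly, letting a file-upload
  // plugin substitute the operations it extracted from multipart data.
  getQuery?: (request: MiniRequest) => unknown;
}

// Mirrors the ternary in the runHttpQuery call above.
function resolveQuery(request: MiniRequest, options: HandlerOptions): unknown {
  return options.getQuery
    ? options.getQuery(request)
    : request.method === "post"
      ? request.payload
      : request.query;
}
```

A file-upload plugin would then pass `getQuery: (request) => extractedOperations` while plain requests keep the default payload/query behavior.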
Oh, and @glasser in case you don't get notified of closed PRs.
I'm not sure I understand the intricacies of hapi well enough to have great intuition here. The pattern that graphql-upload seems to generally take is putting its output on the request, but if hapi is more immutable than that then I guess that's a problem? I don't suppose there's a way to hook into hapi's own parse situation to get the right thing on payload in the first place? But if not, sure, a hapi-specific hook like that (maybe with "hapi" in the name) could work.
The hook apollo-server-hapi currently uses runs before hapi's own parsing. I'll look into it to see if there's a hook either specifically for the parsing, or at least one after it, so that we could maybe monkeypatch its results. If both fail I'll probably come back with a PR for an option like the one above. I got pulled onto another project now though, so it will be a couple of weeks before I can work on this again.
Huh, odd, the current hook reimplements body parsing instead of letting hapi deal with it? That seems non-ideal, ah well. Another option would be to do something like your line above but just call it, say, parsedBody rather than fileUploads? (Naming here is tough since the name "query" is a bit of a double misnomer: it's actually a record containing various keys including "query", which itself can contain non-"query" operations!)
Yes, I could not get `graphql-upload` to do this directly, so I reimplemented its multipart handling in the extension below. This gives the best behavior but is not optimal from a maintenance perspective. jaydenseric/graphql-upload#5 (comment) opened up for exporting a function from `graphql-upload`.

```typescript
// Imports assumed for this sketch; exact specifiers may differ (in
// particular, where `Upload` can be imported from depends on the
// graphql-upload version).
import typeis from "type-is"
import objectPath from "object-path"
import * as Content from "@hapi/content"
import Upload from "graphql-upload/public/Upload"
import type { Request, ResponseToolkit } from "@hapi/hapi"

server.ext({
  type: "onPreHandler",
  method: async function (request, h) {
    if (request.path !== apolloServer.graphqlPath) {
      return h.continue
    }
    if (typeis(request.raw.req, ["multipart/form-data"])) {
      const response = handleFileUpload(request, h)
      if (response) return response
    }
    return h.continue
  },
})

function handleFileUpload(request: Request, h: ResponseToolkit) {
  let operations: any
  try {
    operations = JSON.parse((request.payload as any).operations)
  } catch (error) {
    return h
      .response(`Invalid JSON in the ‘operations’ multipart field`)
      .code(400)
      .takeover()
  }
  if (!(operations !== null && typeof operations === "object")) {
    return h
      .response(`Invalid type for the ‘operations’ multipart field`)
      .code(400)
      .takeover()
  }
  const operationsPath = objectPath(operations)
  let parsedMap: any
  try {
    parsedMap = JSON.parse((request.payload as any).map)
  } catch (error) {
    return h
      .response(`Invalid JSON in the ‘map’ multipart field`)
      .code(400)
      .takeover()
  }
  if (!(parsedMap !== null && typeof parsedMap === "object")) {
    return h
      .response(`Invalid type for the ‘map’ multipart field`)
      .code(400)
      .takeover()
  }
  const map = new Map<string, Upload>()
  for (const [fieldName, paths] of Object.entries(parsedMap)) {
    if (!Array.isArray(paths)) {
      return h
        .response(
          `Invalid type for the ‘map’ multipart field entry key ‘${fieldName}’ array`,
        )
        .code(400)
        .takeover()
    }
    const stream = (request.payload as any)[fieldName]
    const contentType = Content.type(stream.hapi.headers["content-type"])
    const upload = new Upload()
    upload.resolve({
      filename: stream.hapi.filename,
      encoding: contentType.encoding,
      mimetype: contentType.mime,
      createReadStream: () => stream,
    })
    map.set(fieldName, upload)
    for (const [index, path] of paths.entries()) {
      if (typeof path !== "string") {
        return h
          .response(
            `Invalid type for the ‘map’ multipart field entry key ‘${fieldName}’ array index ‘${index}’ value`,
          )
          .code(400)
          .takeover()
      }
      try {
        operationsPath.set(path, map.get(fieldName))
      } catch (error) {
        return h
          .response(
            `Invalid object path for the ‘map’ multipart field entry key ‘${fieldName}’ array index ‘${index}’ value ‘${path}’`,
          )
          .code(400)
          .takeover()
      }
    }
  }
  // request.payload is typed as read-only, hence the cast
  ;(request as any).payload = operations
}
```
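For context, here is a small sketch of the multipart fields that handler expects, following the GraphQL multipart request spec: `operations` and `map` arrive as JSON strings, and each remaining field holds a file stream whose name the map splices into the operations. The `setAtPath` helper below is a toy, hypothetical stand-in for the `object-path` library used above.

```typescript
// Toy stand-in for object-path's `set` (hypothetical, illustration only):
// walks a dotted path and assigns the value at the leaf.
function setAtPath(target: any, dotted: string, value: unknown): void {
  const keys = dotted.split(".")
  let node = target
  for (const key of keys.slice(0, -1)) node = node[key]
  node[keys[keys.length - 1]] = value
}

// What a client sends as multipart form fields (file stream omitted):
const fields: Record<string, string> = {
  operations: JSON.stringify({
    query: "mutation ($file: Upload!) { upload(file: $file) { id } }",
    variables: { file: null },
  }),
  map: JSON.stringify({ "1": ["variables.file"] }),
}

// What the handler reconstructs before replacing request.payload:
const operations = JSON.parse(fields.operations)
const parsedMap: Record<string, string[]> = JSON.parse(fields.map)
for (const [fieldName, paths] of Object.entries(parsedMap)) {
  const upload = { fieldName } // stands in for the resolved Upload instance
  for (const path of paths) setAtPath(operations, path, upload)
}
```

After this runs, `operations.variables.file` holds the upload object, which is why the handler can hand `operations` to the GraphQL layer as the payload.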
The current integration has two problems that cause it to not work at all:
1. `request.mime` is `null` for multipart requests, causing `processFileUploads` to never be called.
2. You cannot set the `payload` property. Currently it's defined as read-only, which causes a crash when hapi later tries to set it, and if I change it to writable, hapi will overwrite it.
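Problem 2 can be reproduced in isolation with plain objects, no hapi involved. This is a sketch under the assumption that hapi's later internal write is a plain strict-mode assignment; the property names are illustrative, not hapi's actual internals.

```typescript
// A property defined with `writable: false` makes a later plain assignment
// throw in strict mode (the crash), while redefining it as writable avoids
// the crash but lets the later write clobber the value (the overwrite).
const request: { payload?: unknown } = {}

Object.defineProperty(request, "payload", {
  value: { fromUploadPlugin: true },
  writable: false,
  configurable: true, // so we can redefine it below
})

function laterInternalWrite(req: { payload?: unknown }) {
  "use strict"
  req.payload = { fromHapi: true } // stands in for hapi's own later write
}

let crashed = false
try {
  laterInternalWrite(request)
} catch {
  crashed = true // TypeError: cannot assign to read-only property
}

// Redefined as writable: no crash, but the plugin's payload is overwritten.
Object.defineProperty(request, "payload", {
  value: { fromUploadPlugin: true },
  writable: true,
  configurable: true,
})
laterInternalWrite(request)
```

This is why neither `writable: false` nor `writable: true` alone solves the integration problem: the write has to happen after hapi's own parsing, as in the `onPreHandler` extension above.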
I also copied the version filtering from the test in express, as that applies here as well.
Fixes #3267