Link to stackblitz project
I made a mini app that works with the ChatGPT API (with the API key hidden). It works, but if the question or answer is too long, the response takes a long time or even exceeds ChatGPT's token limit. Is it possible to receive the response as a stream, chunk by chunk? I can't figure out how to do it. In the provided code I tried, but I only receive the first chunk. If there is a solution, I'd be glad for some help.
You can stream the response with fetch and a ReadableStream. Here is an example:
// Requires: import { Observable } from 'rxjs';
chatStream(url: string, body: string, apikey: string): Observable<string> {
  return new Observable<string>(observer => {
    fetch(url, {
      method: 'POST',
      body: body,
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${apikey}`,
      },
    }).then(response => {
      if (!response.ok) {
        // Handle HTTP errors and stop here.
        observer.error(new Error(`HTTP ${response.status}`));
        return;
      }
      const reader = response.body?.getReader();
      const decoder = new TextDecoder();

      function push() {
        reader?.read().then(({ done, value }) => {
          if (done) {
            observer.complete();
            return;
          }
          // Each chunk contains one or more server-sent events separated by blank lines.
          // (This assumes each read() returns whole events; an event split across
          // chunks would break JSON.parse.)
          const events = decoder.decode(value, { stream: true }).split('\n\n');
          let content = '';
          for (const event of events) {
            if (event === 'data: [DONE]') break;
            if (event.startsWith('data: ')) {
              // Strip the "data: " prefix and pull the delta text out of the JSON payload.
              const data = JSON.parse(event.slice(6));
              content += data.choices[0].delta?.content || '';
            }
          }
          observer.next(content);
          push();
        });
      }
      push();
    }).catch((err: Error) => {
      // Network-level failure (DNS, CORS, connection reset, ...).
      observer.error(err);
    });
  });
}
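One thing to note: the Chat Completions endpoint only streams server-sent events if the request body sets stream: true. A minimal sketch of what the url and body arguments could look like (the model name and the userMessage variable are just illustrative assumptions):

const url = 'https://api.openai.com/v1/chat/completions';
const body = JSON.stringify({
  model: 'gpt-3.5-turbo',   // example model
  stream: true,             // required, otherwise the API returns one complete response
  messages: [{ role: 'user', content: userMessage }],
});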
Then subscribe like this:
let botMessage = '';
chatStream(url, body, apikey).subscribe({
  next: (text) => {
    // Append each streamed chunk as it arrives.
    botMessage += text;
  },
  complete: () => {
    // Stream finished ('data: [DONE]' received).
  },
  error: (err) => {
    // Handle HTTP or network errors.
  },
});
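Note that unsubscribing from the Observable above only stops emissions; it does not cancel the in-flight HTTP request. If you need cancellation, one option (not part of the code above, just a sketch) is to create an AbortController inside the Observable, pass its signal to fetch, and return a teardown function:

return new Observable<string>(observer => {
  const controller = new AbortController();

  fetch(url, { signal: controller.signal /* ...same options as above */ })
    .then(/* ...same streaming logic as above... */);

  // Runs on unsubscribe: aborts the request and releases the connection.
  return () => controller.abort();
});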
Check out my complete application here. The relevant code is in app/@core/http-api.service.ts and app/pages/chat/chat.component.ts.
If you found this helpful, I would greatly appreciate it if you could give me a star.