
I have an Express server running that sends data with res.write().

Fetcher.js

function fetcher() {
  console.log("called fetcher");
  fetch('/units.html', {
    method: "POST",
    body: JSON.stringify({
      "NAME": document.getElementById("NAME").value,
    }),
    //headers: { "Content-Type": "application/json" }
  })
  .then(function(response) {
    console.log("inside middle function");
    return response.text();
  })
  .then(function(data) {
    console.log(data);
  });
}

index.js

app.post('/units.html', function(req, res, next) {
  res.write("hello once");
  res.write("hello twice");
  res.end();
});

I want to do something every time I run res.write.

I see "hello oncehello twice" in browser console, which tells me that all the data is passed after res.end is executed.

Is there a way to run console.log on the client each time the server calls res.write()? Something like this:

Server => res.write("hello once")
Client => console.log("hello once")
Server => res.write("hello twice")
Client => console.log("hello twice")
Server => res.end()

1 Answer


Modern browsers (Chrome, Edge, Firefox) do allow you to access the incoming stream of a fetch() response. MDN documents this, but I found that documentation quite confusing and had to use several other internet sources to figure out how to make it work.

Here's a simple example.

First, the server, which writes several strings to the response with a delay between them when /stream is requested:

const express = require('express');
const app = express();
const path = require('path');

function delay(t, v) {
    return new Promise(resolve => {
        setTimeout(resolve, t, v);
    });
}

app.get("/", (req, res) => {
    res.sendFile(path.resolve("temp.html"));
});

// output separate pieces of data with a delay between them
app.get("/stream", async (req, res) => {
    // must set content-type for this to work in Firefox
    res.setHeader('Content-Type', 'text/plain');

    res.write("hello1");
    await delay(1000);
    res.write("hello2");
    await delay(1000);
    res.write("hello3");
    await delay(1000);
    res.end("hello4");
});

app.listen(80);

Then, the "/" HTML page from this web server, that page then makes a request to /stream:

<!doctype html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <link rel="icon" href="data:,">
</head>
<body>
    This is an empty page
    <script>

    function Decodeuint8arr(uint8array){
        return new TextDecoder("utf-8").decode(uint8array);
    }

    async function run() {
        const response = await fetch('/stream');
        const reader = response.body.getReader();
        while (true) {
            const { value, done } = await reader.read();
            if (done) break;
            console.log(`Received: "${Decodeuint8arr(value)}"`);
        }
        console.log('Response fully received');
    }
    run().catch(err => {
        console.log(err);
    });

    </script>
</body>
</html>

When I request this page in Chrome, I get this output:

Received: "hello1"
Received: "hello2"
Received: "hello3"
Received: "hello4"
Response fully received

So, you can get incoming data as it arrives on the stream. Keep in mind that there are no particular guarantees about the boundaries between the chunks that arrive. This example happens to show each write arriving neatly as its own chunk, but that is only because I'm sending very small pieces of data that fit in one packet over a very fast network. If the data were larger, it might not arrive in just one chunk, and separate writes may also be combined into a single chunk.

So, for anything other than very simple uses, you will need at least some simple "protocol" for your data format so you can reliably tell where one message ends and the next begins, plus code that parses for that. For example, you could delimit each message with a special character (such as a linefeed), and your reading code would not assume it has a complete message until it sees that linefeed character. A minimal sketch of that approach follows.
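For illustration only, here is one way the client-side read loop could buffer incoming chunks and split them on newlines. This assumes the server terminates each message with "\n" (e.g. res.write("hello1\n")); the readLines function name is just for this sketch, not part of any API:

    // Minimal sketch: newline-delimited framing on the client.
    // Assumes each server-side res.write() payload ends with "\n".
    async function readLines(url) {
        const response = await fetch(url);
        const reader = response.body.getReader();
        const decoder = new TextDecoder("utf-8");
        let buffer = "";
        while (true) {
            const { value, done } = await reader.read();
            if (done) break;
            // stream: true keeps multi-byte characters split across chunks intact
            buffer += decoder.decode(value, { stream: true });
            let newlineIndex;
            while ((newlineIndex = buffer.indexOf("\n")) !== -1) {
                const line = buffer.slice(0, newlineIndex);
                buffer = buffer.slice(newlineIndex + 1);
                console.log(`Received message: "${line}"`);
            }
        }
        // Anything left over is a final message without a trailing newline
        if (buffer.length > 0) {
            console.log(`Received message: "${buffer}"`);
        }
        console.log('Response fully received');
    }

    readLines('/stream').catch(err => console.log(err));

With this kind of framing, it no longer matters whether the network delivers one write per chunk, splits a write across chunks, or merges several writes into one chunk; the client only acts on complete lines.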
