
I am working on building an OTT platform but am facing an issue uploading large files to the server. I have tried using multer to store the file in a temp folder and then uploading it with the aws-sdk s3.upload. It works fine with small files, but if I try to upload a large file, it returns:

Network Error or Error 413 request entity too large

Following the Error 413, I changed nginx.conf and set client_max_body_size 0; (0 means unlimited), but still nothing changed. The relevant part of the config is shown below.
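Roughly, the directive sits in the block that proxies to the Node app (the block layout here is approximate, not my exact nginx.conf):

server {
    # 0 disables nginx's request body size check (the default 1m is what triggers the 413)
    client_max_body_size 0;

    # ... rest of the existing server block (proxy_pass to the Node app, etc.)
}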

I have also tried multer-s3 with no success, and later busboy, but I am still facing the same issue. I am attaching my code below; I am using busboy on the server and Axios in React. Please help.

server.js

const express = require('express');
const bodyParser = require('body-parser');
const mongoose = require('mongoose');
const passport = require('passport');
const helmet = require('helmet');
const path = require('path');
const morgan = require('morgan');
const cors = require('cors');
const dotenv = require('dotenv');
// var admin = require('firebase-admin');
const rateLimit = require('express-rate-limit');
const busboy = require('connect-busboy');




const { setCloudinary } = require('./middleware/cloudinary');
// initializing the app
const app = express();

app.use(cors());
// app.use(helmet());
app.use(
  busboy({
    highWaterMark: 10 * 1024 * 1024, // Set 10 MiB buffer
  })
); // register the busboy middleware

// for environment files
if (process.env.NODE_ENV === 'production') {
  dotenv.config({ path: './env/.env.production' });
} else {
  dotenv.config({ path: './env/.env' });
}

const PORT = process.env.PORT || 5000;
const mongoDbUrl = process.env.mongoDbUrl;

const profileRoute = require('./routes/profile');
const adminRoute = require('./routes/admin');
const planRoute = require('./routes/plan');
const videoRoute = require('./routes/video');

// connecting to the MongoDB server
mongoose
  .connect(mongoDbUrl, {
    useNewUrlParser: true,
    useFindAndModify: false,
    useCreateIndex: true,
    useUnifiedTopology: true,
  })
  .then((result) => {
    if (result) {
      setCloudinary();
      // if all goes right, start listening on the server
      // var server = https.createServer(options, app);
      // console.log(server);
      app.listen(PORT, (err) => {
        if (err) throw err;
        console.log(`server is running at ${PORT}`);
      });
    }
  })
  .catch((err) => {
    throw err;
  });

// request logging
if (process.env.NODE_ENV === 'production') {
  app.use(morgan('tiny'));
} else {
  app.use(morgan('dev'));
  mongoose.set('debug', true);
}

// initializing passport
app.use(passport.initialize());
// require('./utils/adminRole')(passport);
require('./utils/firebase');
require('./utils/gcm');

app.use(express.json());
app.use(
  express.urlencoded({
    extended: true,
  })
);

// API serving routes
app.use('/api/v1/profile', profileRoute);
app.use('/api/v1/admin', adminRoute);
app.use('/api/v1/plan', planRoute);
app.use('/api/v1/videos', videoRoute);
// FOR REACT JS APP
//if the app is in production then serve files also
// if (process.env.NODE_ENV === 'production' || process.env.NODE_ENV === 'test') {
app.use(express.static(path.join(__dirname, 'client', 'build')));
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, 'client', 'build', 'index.html'));
});
// }

// task

require('./jobs/Jobs');

router.js

const fs = require('fs');
const path = require('path');
const { fork } = require('child_process');
const router = require('express').Router();

router.post('/add/video', (req, res) => {
  req.pipe(req.busboy); // pipe the request through busboy

  req.busboy.on('file', (fieldname = 'video', file, filename) => {
    console.log(`Upload of '${filename}' started`);

    // Create a write stream of the new file
    const fstream = fs.createWriteStream(path.join('temp/', filename));
    // pipe the file into the write stream
    file.pipe(fstream);

    // On finish of the upload
    fstream.on('close', () => {
      console.log(`Upload of '${filename}' finished`);

      // a stream cannot be serialized over IPC, so pass the file path instead
      const childProcess = fork('./processVideo.js', ['message']);
      childProcess.on('message', (msg) => res.send(msg));
      childProcess.send({ filePath: path.join('temp/', filename), data: req.body });
    });
  });
});

In React I am uploading with Axios:

const config = {
  headers: {
    'Content-Type': 'multipart/form-data',
  },
  onUploadProgress: function (progressEvent) {
    const percentCompleted = Math.round(
      (progressEvent.loaded * 100) / progressEvent.total
    );
    console.log(percentCompleted);
  },
};

const formData = new FormData();
formData.append('video', selectedVideo);
formData.append('title', title);
formData.append('description', description);
formData.append('movieCategory', movieCategory);
formData.append('thumbnail', thumbnail);
formData.append('price', price);
formData.append('isPremium', isPremium);
formData.append('quality', quality);
formData.append('language', language);
formData.append('releaseYear', releaseYear);
formData.append('duration', duration);

axios
  .post('/api/v1/admin/add/video', formData, config)
  .then((res) => {
    console.log(res);
    alert('File Upload success');
  })
  .catch((err) => {
    console.log(err);
    alert('File Upload Error');
  });

4 Answers


I suggest you use presigned S3 upload links. In that case the server is only responsible for returning a presigned upload URL, and the client code uploads the file directly to AWS S3.
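A rough sketch of what the server side could look like with the aws-sdk v2 you are already using (the bucket name and route path are placeholders):

// server: hand out a presigned PUT URL instead of receiving the file itself
const AWS = require('aws-sdk');
const router = require('express').Router();

const s3 = new AWS.S3();

router.get('/presign', (req, res) => {
  const params = {
    Bucket: 'my-video-bucket', // placeholder bucket
    Key: `videos/${req.query.filename}`,
    ContentType: req.query.contentType,
    Expires: 15 * 60, // URL valid for 15 minutes
  };
  // 'putObject' presigned URLs let the browser PUT the file straight to S3
  s3.getSignedUrl('putObject', params, (err, url) => {
    if (err) return res.status(500).json({ error: err.message });
    res.json({ url });
  });
});

On the client you would then do something like axios.put(url, selectedVideo, { headers: { 'Content-Type': selectedVideo.type } }), so the large body never has to pass through nginx or Node at all.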


1 Comment

Thank you for your help, but I was looking to upload it to the server first and then upload it to S3. Solved using multithreaded file uploading.

Have you tried using S3 multipart uploads, or potentially Transfer Acceleration?
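For example, with the aws-sdk v2 s3.upload that the question already uses, multipart uploads and parallelism are mostly configuration; bucket, key and file path below are placeholders, and Transfer Acceleration has to be enabled on the bucket for the accelerate endpoint to work:

const fs = require('fs');
const AWS = require('aws-sdk');

// useAccelerateEndpoint routes the upload through the S3 Transfer Acceleration endpoint
const s3 = new AWS.S3({ useAccelerateEndpoint: true });

// s3.upload() performs a multipart upload automatically for large bodies;
// partSize and queueSize control the chunk size and how many parts upload in parallel
const upload = s3.upload(
  {
    Bucket: 'my-video-bucket', // placeholder
    Key: 'videos/movie.mp4', // placeholder
    Body: fs.createReadStream('temp/movie.mp4'), // placeholder path
  },
  { partSize: 10 * 1024 * 1024, queueSize: 4 }
);

upload.on('httpUploadProgress', (evt) => console.log(`${evt.loaded} / ${evt.total}`));

upload
  .promise()
  .then((data) => console.log('Uploaded to', data.Location))
  .catch((err) => console.error(err));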

1 Comment

Thank you for your help, but I was looking to upload it to the server first and then upload it to S3. Solved using multithreaded file uploading.

I see you are using Express. You need to raise the Express request body size limit; the default is 100kb.

app.use(express.json({ limit: '50mb' }));
app.use(express.urlencoded({ limit: '50mb', extended: true }));

Multer also enforces its own limits, so try raising the file size limit there too:

const video_upload = multer({
  storage: videoStorage,
  fileFilter: videoFilter,
  limits: {
    fileSize: 50 * 1024 * 1024, // file size limit in bytes (numeric values only)
  },
});

1 Comment

Thank you, but I was looking to upload files of about 3-4 GB+. Solved using multithreaded file uploading.

I solved the issue using multithreaded (chunked) file uploading. You can read about it in the blog post here.
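The gist of the approach is to slice the file on the client and upload the chunks in parallel, each one small enough to stay well under any body size limit, and have the server reassemble them. A simplified sketch (the /chunk endpoint and field names are only illustrative, not the exact code from the blog):

import axios from 'axios';

// client side: slice the File and POST the chunks concurrently
const CHUNK_SIZE = 10 * 1024 * 1024; // 10 MiB per chunk (tunable)

async function uploadInChunks(file) {
  const totalChunks = Math.ceil(file.size / CHUNK_SIZE);
  const uploadId = `${file.name}-${Date.now()}`; // lets the server group the chunks

  const requests = [];
  for (let index = 0; index < totalChunks; index++) {
    const chunk = file.slice(index * CHUNK_SIZE, (index + 1) * CHUNK_SIZE);
    const formData = new FormData();
    formData.append('chunk', chunk);
    formData.append('index', index);
    formData.append('totalChunks', totalChunks);
    formData.append('uploadId', uploadId);
    // hypothetical endpoint: the server stores each chunk and reassembles the
    // file (then pushes it to S3) once all chunks have arrived
    requests.push(axios.post('/api/v1/admin/add/video/chunk', formData));
  }

  await Promise.all(requests);
}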
