Can we write a function on Firebase that triggers every hour, parses a given website's page into XML, and inserts that data into the Firebase database? If it's possible, how can I do that? Even a little help would be very welcome.

Thanks in advance!

  • That all sounds possible. But right now your question is too broad to answer on Stack Overflow. Since there are three steps in your problem (reading data from a web site with Node.js, inserting data into the Firebase Database, running the Node.js code on Cloud Functions), I'd suggest starting with the first one and posting back with an MCVE when you get stuck. Commented Mar 27, 2018 at 13:13
  • Thank you, Frank, for giving me direction. Commented Mar 27, 2018 at 13:29

2 Answers


Yes, you can. Use cron to trigger the function. In the function you will have the logic to get the data from the website and save it in the database.
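
For example, here is a minimal sketch of a natively scheduled function. It assumes firebase-functions v3+ and the Blaze plan (Cloud Scheduler triggers need both); the /scrapes path in the Realtime Database is just an example location, not something from the question:

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

// Scheduled trigger: Cloud Scheduler invokes this function once per hour.
exports.hourlyScrape = functions.pubsub.schedule('every 60 minutes').onRun(async (context) => {
    // ...fetch and parse the website here (see the other answer for a cheerio example)...
    const data = { lastRun: admin.database.ServerValue.TIMESTAMP };

    // Example only: push the result under /scrapes in the Realtime Database.
    await admin.database().ref('scrapes').push(data);
    return null;
});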


2 Comments

Hello Chandrika! Can you write some example code for that function (in detail)? I'm not very good at JavaScript, and I couldn't find information about how to parse a page in JavaScript using Firebase.
I haven't done this exact case. I have used an external cron (https://cron-job.org/en/) to call the service, and I had a front end where I uploaded a spreadsheet, parsed it, and then saved the data to Firebase.

For other people finding this with a similar problem:

lgvalle posted a helpful gist showing how to scrape websites in Cloud Functions:

const rp = require('request-promise');
const cheerio = require('cheerio');

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

const db = admin.firestore();

exports.allyPallyFarmersMarket = functions.https.onRequest((request, response) => {
    const topic = "allyPallyFarmersMarket"
    const url = 'https://weareccfm.com/city-country-farmers-markets/market-profiles/alexandra-palace-market/'
    const options = {
        uri: url,
        headers: { 'User-Agent': 'test' },
        // let request-promise hand back a cheerio-wrapped document
        transform: (body) => cheerio.load(body)
    }
    rp(options)
        .then(($) => {
            const scrap = $('strong').text()
            const [location, date, address] = scrap.split("–")

            // EDIT BY neogucky:
            // Here you can access the scraped vars: location, date, address
            response.status(200).send({ topic, location, date, address })
        })
        .catch((err) => response.status(400).send(err))
});

https://gist.github.com/lgvalle/df2a0a7ee10266ca8056fa15654307d8
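
The gist only covers the scraping part. To also store the result, the handler could write to Firestore through the db handle declared above. This is just a sketch; the 'markets' collection and the scrapeAndStore name are my own examples, not part of the gist:

// Reuses the require() calls and admin.initializeApp() from the snippet above.
exports.scrapeAndStore = functions.https.onRequest((request, response) => {
    const options = {
        uri: 'https://weareccfm.com/city-country-farmers-markets/market-profiles/alexandra-palace-market/',
        headers: { 'User-Agent': 'test' },
        transform: (body) => cheerio.load(body)
    }

    rp(options)
        .then(($) => {
            const [location, date, address] = $('strong').text().split("–")

            // Example only: keep one document per market in a 'markets' collection.
            return db.collection('markets').doc('allyPallyFarmersMarket').set({
                location: location,
                date: date,
                address: address,
                updatedAt: admin.firestore.FieldValue.serverTimestamp()
            })
        })
        .then(() => response.status(200).send('Saved'))
        .catch((err) => response.status(400).send(err))
});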

Add the needed dependencies; your package.json should look like this:

"dependencies": {
    "firebase-admin": "~6.0.0",
    "firebase-functions": "^2.0.3",
    "request-promise": "~4.2.2",
    "cheerio": "~0.22.0"
},

If you send JSON data {website: 'https://myurl.org'} in the request, you can access it with:

request.body.website
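
For example, a quick way to test this from a Node script (the function URL below is a placeholder for your own deployed endpoint, and the global fetch assumes Node 18+):

// Replace the URL with your own deployed function endpoint.
const url = 'https://us-central1-<your-project>.cloudfunctions.net/allyPallyFarmersMarket';

fetch(url, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ website: 'https://myurl.org' })
})
    .then((res) => res.text())
    .then((text) => console.log(text))
    .catch((err) => console.error(err));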
