I am getting data from the OpenWeatherMap API. Currently the data is retrieved synchronously, which is slow. However, the function has to be synchronous because it is part of a library, although it can call an async function. How might I still make concurrent requests to increase performance? A solution that does not use reqwest would work, but reqwest is preferred.

fn get_combined_data(
    open_weather_map_api_url: String,
    open_weather_map_api_key: String,
    coordinates: Vec<String>,
    metric: bool,
) -> Vec<HashMap<String, String>> {
    let urls: Vec<String> = get_urls(
        open_weather_map_api_url,
        open_weather_map_api_key,
        coordinates.get(0).expect("Improper coordinates").to_string()
            + ","
            + coordinates.get(1).expect("Improper coordinates"),
        metric,
    );
    let mut data: Vec<HashMap<String, String>> = Vec::new();
    for url in urls {
        let request = reqwest::blocking::get(url)
            .expect("Url Get failed")
            .json()
            .expect("json expected");
        data.push(request);
    }
    return data;
}

3 Answers

If your program isn't already async, probably the easiest way is to use rayon to run the blocking requests in parallel.

use reqwest;
use std::collections::HashMap;
use rayon::prelude::*;

fn get_combined_data(
    open_weather_map_api_url: String,
    open_weather_map_api_key: String,
    coordinates: Vec<String>,
    metric: bool,
) -> Vec<HashMap<String, String>> {
    let urls: Vec<String> = get_urls(
        open_weather_map_api_url,
        open_weather_map_api_key,
        coordinates.get(0).expect("Improper coordinates").to_string()
            + ","
            + coordinates.get(1).expect("Improper coordinates"),
        metric,
    );

    // par_iter() runs the blocking requests in parallel on rayon's thread pool.
    let data: Vec<_> = urls
        .par_iter()
        .map(|url| {
            reqwest::blocking::get(url)
                .expect("Url Get failed")
                .json()
                .expect("json expected")
        })
        .collect();

    return data;
}
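
By default rayon sizes its thread pool to the number of CPU cores, which also caps how many requests are in flight at once. If you want an explicit limit (for example to stay under an API rate limit), here is a minimal sketch using a dedicated pool; the pool size of 4 and the fetch_with_pool helper are illustrative assumptions, not part of the code above:

use rayon::prelude::*;
use std::collections::HashMap;

// Sketch: run the blocking requests on a dedicated 4-thread rayon pool.
fn fetch_with_pool(urls: Vec<String>) -> Vec<HashMap<String, String>> {
    let pool = rayon::ThreadPoolBuilder::new()
        .num_threads(4) // arbitrary cap on concurrent requests
        .build()
        .expect("couldn't build thread pool");
    // par_iter() inside `install` runs on this pool instead of the global one.
    pool.install(|| {
        urls.par_iter()
            .map(|url| {
                reqwest::blocking::get(url)
                    .expect("Url Get failed")
                    .json()
                    .expect("json expected")
            })
            .collect()
    })
}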

The easiest approach is probably to use tokio's new_current_thread runtime and block on the data retrieval.

use std::collections::HashMap;
use tokio::runtime;
pub fn collect_data() -> Vec<HashMap<String, String>> {
    let rt = runtime::Builder::new_current_thread()
        // enable the IO and time drivers so reqwest can run on this runtime
        .enable_all()
        .build()
        .expect("couldn't start runtime");
    let urls = vec!["https://example.com/a", "https://example.com/b"];
    rt.block_on(async move {
        // Build the futures first without awaiting them...
        let mut data = vec![];
        for url in urls {
            data.push(async move {
                reqwest::get(url)
                    .await
                    .expect("Url Get Failed")
                    .json()
                    .await
                    .expect("json expected")
            });
        }
        // ...then join_all drives them all concurrently.
        futures::future::join_all(data).await
    })
}
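
If collect_data is called repeatedly, building a fresh runtime on every call adds avoidable overhead. A minimal sketch of building it once and reusing it, using a std::sync::OnceLock static (the runtime() helper is illustrative, not part of the answer above):

use std::sync::OnceLock;
use tokio::runtime::Runtime;

// Sketch: build the runtime once and hand out a reference on later calls.
fn runtime() -> &'static Runtime {
    static RT: OnceLock<Runtime> = OnceLock::new();
    RT.get_or_init(|| {
        tokio::runtime::Builder::new_current_thread()
            .enable_all()
            .build()
            .expect("couldn't start runtime")
    })
}

collect_data can then call runtime().block_on(...) instead of constructing its own runtime each time.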


You need an asynchronous runtime in order to call asynchronous functions. The easiest way to get one is the #[tokio::main] attribute, which despite its name can be applied to any async fn, not just main. The attribute wraps the async body in a call to block_on, so callers still see an ordinary synchronous function:

#[tokio::main]
async fn get_combined_data(
    open_weather_map_api_url: String,
    open_weather_map_api_key: String,
    coordinates: Vec<String>,
    metric: bool,
) -> Vec<HashMap<String, String>> {
    let urls: Vec<String> = get_urls(
        open_weather_map_api_url,
        open_weather_map_api_key,
        coordinates
            .get(0)
            .expect("Improper coordinates")
            .to_string()
            + ","
            + coordinates.get(1).expect("Improper coordinates"),
        metric,
    );
    // Turn each URL into a future, then let join_all drive them all concurrently.
    futures::future::join_all(urls.into_iter().map(|url| async move {
        reqwest::get(url)
            .await
            .expect("Url Get Failed")
            .json()
            .await
            .expect("json expected")
    }))
    .await
}
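
join_all starts every request at once, which is fine for a handful of URLs but can trip rate limits if the list grows. If you want a cap on concurrency, here is a minimal sketch using futures::StreamExt::buffer_unordered; the limit of 8 and the fetch_limited helper are arbitrary illustrations, not part of the answer above:

use futures::stream::{self, StreamExt};
use std::collections::HashMap;

// Sketch: fetch the URLs with at most 8 requests in flight at a time.
// Results arrive in completion order, not input order.
async fn fetch_limited(urls: Vec<String>) -> Vec<HashMap<String, String>> {
    stream::iter(urls)
        .map(|url| async move {
            reqwest::get(url)
                .await
                .expect("Url Get Failed")
                .json()
                .await
                .expect("json expected")
        })
        .buffer_unordered(8) // cap on concurrent requests
        .collect()
        .await
}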

