I have the following two functions to check whether a tutorial has been shown to the user. Through the AJAX call, I check the database and return 1 if the user has seen the tutorial before, or 0 if not. This is then translated to a boolean true or false by the JS function. The trouble is that when the user has seen the tour I get true, but when he hasn't I get 0 where I expected false. I'm not sure what I'm doing wrong here.
async function startTour(name){
    const tourShown = await checkTutorial(name);
    console.log(tourShown); // logs true or 0. Should log true or false
    if(!tourShown){
        console.log('starting tour');
    }
}

function checkTutorial(name){
    if(localStorage[name])
        return true;
    return $.ajax({
        url: 'inc/tutorials',
        type: 'POST',
        data: {
            action: 'check',
            tutorial: name
        },
        success: function(data){
            console.log('check results ' + data); // I receive 1 or 0 as expected
            if(data == 1){
                localStorage[name] = 1;
                return true;
            } else {
                console.log('returning false'); // does execute in case data is 0
                return false;
            }
        }
    });
}

startTour('some_tour');
The true comes from the early return in if(localStorage[name]) return true;. The 0 comes from return $.ajax({...}), which does NOT return either of the two values returned inside the success callback, but the jqXHR object itself. That zero can also appear because the request isn't done yet: the jqXHR is returned BEFORE the success callback executes. Don't rely on return values from success when you are using the jqXHR as a promise.
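As a minimal sketch of one way to fix it, assuming the same inc/tutorials endpoint that answers with 1 or 0: await the jqXHR (it is thenable) inside checkTutorial and convert the response to a boolean there, instead of returning values from success.

    // Sketch: resolve to a real boolean instead of the raw jqXHR.
    // Assumes the existing endpoint still responds with 1 or 0.
    async function checkTutorial(name){
        if(localStorage[name])
            return true;

        // Awaiting the jqXHR gives us the response data once it arrives
        const data = await $.ajax({
            url: 'inc/tutorials',
            type: 'POST',
            data: {
                action: 'check',
                tutorial: name
            }
        });

        const seen = data == 1;      // loose equality coerces '1'/'0' strings too
        if(seen){
            localStorage[name] = 1;  // cache so the next call short-circuits
        }
        return seen;
    }

startTour can stay exactly as it is; await checkTutorial(name) will now yield true or false. If you prefer not to use async/await there, returning $.ajax(...).then(data => data == 1) gives the same result.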