Ionic Angular JS distance shows twice then disappears (NaN) - javascript

I have been battling with this pretty much for the whole day.
In list1.ts:
Inside the constructor I have the following code, which gets an array of items and places them in this.items. After that it gets the user's current position and injects a distance variable (calculated from the lat/lng stored in Firebase and the user's lat/lng) into this.items:
this.categoryId = this.navParams.get('categoryId');
afoDatabase.list('/list', {
  query: {
    orderByChild: "categoryId",
    equalTo: parseInt(this.categoryId)
  }
}).subscribe(listItems => {
  this.items = listItems;
  this.geolocation.getCurrentPosition({ timeout: 15000 }).then((resp) => {
    this.myLat = resp.coords.latitude;
    this.myLong = resp.coords.longitude;
  }).catch((error) => {
    console.log('Error getting location', error);
  });
  for (var i = 0, len = this.items.length; i < len; i++) {
    this.distance = this.calculateDistance(this.myLat, this.myLong, this.items[i].lat, this.items[i].lng);
    this.items[i]["distance"] = Math.round(this.distance);
    console.log('testing myLat', this.myLat);
    console.log('testing myLong', this.myLong);
  }
  console.log('testing inject', this.items);
  loadingPopup.dismiss().catch(() => {});
});
In list1.html:
I show the distance using the following:
{{item.distance}}
This works fine, and the distance shows. If I go back a page (to the root page) and then return to the page where I calculate the distance, the distance still shows fine. The third time, however, it just throws NaN:
First try: lat/lng show values in the console, and distance: 4625
Second try: lat/lng show values in the console, and distance: 4625
Third try: lat/lng show undefined in the console, and distance: NaN
If you have any idea, please let me know :)
Thanks.

I believe there are 2 problems here:
this.geolocation.getCurrentPosition({ timeout: 15000 }).then((resp) => {
  this.myLat = resp.coords.latitude;
  this.myLong = resp.coords.longitude;
First, this code is asynchronous, so the callback executes after your for loop has already run; this.myLat and this.myLong will therefore still hold whatever values they were previously assigned, which I believe come from some initialisation or something else in your code.
Second, if you put a console.log(this.myLat, this.myLong) right after the last line of the code I have quoted, I believe you will see that they are undefined. So maybe coords doesn't have latitude/longitude properties? Try console.log(resp.coords) to check.
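One way to sequence this correctly is to run the distance loop only after the coordinates have actually arrived, i.e. inside the .then() callback. This is a minimal sketch, assuming the rest of the constructor stays as in the question:

afoDatabase.list('/list', { query: { orderByChild: "categoryId", equalTo: parseInt(this.categoryId) } })
  .subscribe(listItems => {
    this.items = listItems;
    this.geolocation.getCurrentPosition({ timeout: 15000 }).then((resp) => {
      this.myLat = resp.coords.latitude;
      this.myLong = resp.coords.longitude;
      // The coordinates are only available here, so calculate the distances now.
      for (let i = 0, len = this.items.length; i < len; i++) {
        const distance = this.calculateDistance(this.myLat, this.myLong, this.items[i].lat, this.items[i].lng);
        this.items[i].distance = Math.round(distance);
      }
      loadingPopup.dismiss().catch(() => {});
    }).catch((error) => {
      console.log('Error getting location', error);
      loadingPopup.dismiss().catch(() => {});
    });
  });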

Related

Function A depends on previous function to update the state, but Function A still tries to render before the update

I have linked an example below that shows the problem I have.
My Problem
I have these two functions
const updatedDoc = checkForHeadings(stoneCtx, documentCtx); // returns object
documentCtx.setUserDocument(updatedDoc); // uses object to update state
and
convertUserDocument(stoneCtx, documentCtx.userDocument);
// uses State for further usage
The problem I have is that convertUserDocument runs with an empty state and throws an error, and then runs again with the updated state. Since it has already thrown an error, I cannot continue to work with it.
I have tried several different approaches.
What I tried
In the beginning my code looked like this
checkForHeadings(stoneCtx, documentCtx);
// updated the state with each new key:value inside the function
convertUserDocument(stoneCtx, documentCtx.userDocument);
// then this function was run; Error
Then I tried the version I had above, to first put everything into an object and update the state only once.
I also tried having convertUserDocument be a callback inside checkForHeadings, but that ran it as many times as a matching key was found.
My current attempt was to put both functions into separate useEffects, one for the initial render and one for the next render.
const isFirstRender = useRef(true);
let init = 0;
useEffect(() => {
  init++;
  console.log('Initial Render Number ' + init);
  console.log(documentCtx);
  const updatedDoc = checkForHeadings(stoneCtx.stoneContext, documentCtx);
  documentCtx.setUserDocument(updatedDoc);
  console.log(updatedDoc);
  console.log(documentCtx);
  isFirstRender.current = false; // toggle flag after first render/mounting
  console.log('Initial End Render Number ' + init);
}, []);

let update = 0;
useEffect(() => {
  update++;
  console.log('Update Render Number ' + update);
  if (!isFirstRender.current) {
    console.log('First Render has happened.');
    convertUserDocument(stoneCtx.stoneContext, documentCtx.userDocument);
  }
  console.log('Update End Render Number ' + update);
}, [documentCtx]);
The interesting part here was seeing the difference between Codesandbox and my local development.
On Codesandbox the initial render was called twice, but each time the counter didn't go up; it stayed at 1. On my local dev server, on the other hand, the initial render was called only once.
In both versions the second useEffect was called twice, but here too the counter didn't go up to 2 and stayed at 1.
(Console screenshots from Codesandbox and the local dev server omitted.)
Short example of that:
let counter = 0;
useEffect(() => {
  counter++;
  // This should only run once, but it runs twice in the sandbox.
  // Even so, the counter does not go up to 2; it stays at 1.
}, []);
The same happens with the second useEffect; there I get different results, but the counter still stays at 1.
I was told this is due to a stale closure, but that doesn't explain why the important bits don't work properly.
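For reference, a stale closure here means the effect captured the state value from the render in which it was created, so it keeps seeing the old value. A minimal sketch (not taken from the code above):

import { useEffect, useState } from "react";

function Example() {
  const [count, setCount] = useState(0);

  useEffect(() => {
    setCount(1);
    // Still logs 0: the effect closed over the `count` value
    // from the render in which the effect was created.
    console.log(count);
  }, []);

  return <p>{count}</p>;
}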
To skip the initial render, I got my inspiration from here: https://stackoverflow.com/a/61612292/14103981
Code
Here is the Sandbox with the Problem displayed: https://codesandbox.io/s/nameless-wood-34ni5?file=/src/TextEditor.js
I have also create it on Stackblitz: https://react-v6wzqv.stackblitz.io
The error happens in this function:
function orderDocument(structure, doc, ordered) {
  structure.forEach((el) => {
    console.log(el.id);
    console.log(doc);
    // ordered.push(doc[el.id].headingHtml);
    // if (el.children?.length) {
    //   orderDocument(el.children, doc, ordered);
    // }
  });
  return ordered;
}
The commented-out code throws the error. I am console.logging el.id and doc, and in the console you can see that doc is empty, so doc[el.id] cannot be found.
Someone gave me this simple example for my problem, which sums it up pretty well.
useEffect(() => {
  documentCtx.setUserDocument('ANYTHING');
  console.log(documentCtx.userDocument);
});
The Console:
{}
ANYTHING
You can view it here: https://stackblitz.com/edit/react-f1hwky?file=src%2FTextEditor.js
I have come to a solution to my problem.
const isFirstRender = useRef(true);

useEffect(() => {
  const updatedDoc = checkForHeadings(stoneCtx.stoneContext, documentCtx);
  documentCtx.setUserDocument(updatedDoc);
}, []);

useEffect(() => {
  if (!isFirstRender.current) {
    convertUserDocument(stoneCtx.stoneContext, documentCtx.userDocument);
  } else {
    isFirstRender.current = false;
  }
}, [documentCtx]);
Moving isFirstRender.current = false; into an else branch actually gives me the results I want.
Is this the best way of achieving it, or are there better ways?

How to "control" when Firestore.getAll(...promises) stops by reject

Context:
I'm writing a cloud function to send pushes to multiple users. I need to recover the info of each user to know some data like name, country, etc.
Problem:
Currently I recover the list of user IDs and, once I have it, I create an array of promises to recover all the info:
var usersPromises = []
for (var i = 0; i < usersInRange.length; i++) {
  usersPromises[i] = firestore.collection("users").doc(usersInRange[i])
}
Then I recover and send the push using firestore.getAll():
firestore.getAll(...usersPromises).then(results => {
  for (const snapshot of results) {
    if (snapshot.exists) {
      ......
      var user = snapshot.data()
      ......
    } else {
      ......
    }
  }
})
This solution actually works "fine" almost all the time. But right now the Firestore db has some users that do not exist, or something is wrong, because getAll() stops before finishing all the promises. I know this because no push is sent, and the console just says that the method has finished.
Reading SO and the documentation, I saw that getAll stops if some promise is "broken" (all or nothing).
And here is where I'm lost. How can I "force" it, or do this another way, so that it just skips the promises that can't be completed?
P.S:
I tried to do it with a "for", but it seems to omit some promises:
for (var i = 0; i < usersPromises.length; i++) {
  usersPromises[i]
    .get()
    .then(snapshot => {
      if (snapshot.exists) {
        ......
        var user = snapshot.data()
        ......
      } else {
        ......
      }
    })
}
I think it's not a problem with getAll. I have tested it like this:
const firestore = new Firestore();
let doc = []
doc[0] = firestore.doc('test/test');
doc[1] = firestore.doc('test/test1');
doc[2] = firestore.doc('test/doc');

firestore.getAll(...doc)
  .then(result => result.forEach(doc => console.log(doc._fieldsProto)))
  .catch(err => console.log(err));
In my database I have the 'test/test' and 'test/doc' documents, but I do not have 'test/test1'. For the document that does not exist we simply get undefined, and that's all; getAll does not stop. I suggest adding a catch and checking whether there is any exception. While I was writing this test, the function was actually interrupted by a typo in an inner function.
I hope this will help!
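Building on that suggestion, a minimal sketch (assuming usersPromises holds the document references from the question) would be to skip snapshots that don't exist and attach a catch so any real error shows up in the logs:

firestore.getAll(...usersPromises)
  .then(results => {
    results.forEach(snapshot => {
      if (!snapshot.exists) {
        // Non-existent documents still come back as snapshots; just skip them.
        return;
      }
      var user = snapshot.data();
      // ... build and send the push for this user ...
    });
  })
  .catch(err => console.error('getAll failed', err));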

Push not working inside firebase listener

var db = firebase.firestore();
var musicidarray = [];
var musicpaircontentarray = [];

// Retrieve all music value pairs
db.collection("MusicIdNamePairs").get().then((querySnapshot) => {
  querySnapshot.forEach((doc) => {
    musicidarray.push(doc.id);
    musicpaircontentarray.push(doc);
    //alert(doc.get("Name"));
    //console.log(`${doc.id} => ${doc.data()}`);
  });
});

alert(musicidarray.length); // Surprisingly outputs the length as zero even though the previous loop has run
for (var i = 0; i < musicpaircontentarray.length; i++) {
  alert(musicpaircontentarray[i].get("Name"));
}
Here the musicidarray and the musicpaircontentarray (storing the document references obtained from Cloud Firestore) show a length of zero even after the push operations inside the forEach loop in the previous block of code have executed. What is wrong here? Please help me. Thanks a lot for the help.
<script>
  var db = firebase.firestore();
  var musicidarray = [];
  var musicpaircontentarray = [];

  // Retrieve all music value pairs
  db.collection("MusicIdNamePairs").get().then((querySnapshot) => {
    querySnapshot.forEach((doc) => {
      musicidarray.push(doc.id);
      musicpaircontentarray.push(doc);
      //alert(doc.get("Name"));
      //console.log(`${doc.id} => ${doc.data()}`);
    });
    displayarray();
  });

  function displayarray() {
    alert(musicidarray.length);
    for (var i = 0; i < musicpaircontentarray.length; i++) {
      alert(musicpaircontentarray[i].get("Name"));
    }
  }
</script>
The issue here, as far as I understand, is that array.length is read even before the data has been retrieved from the database (even though the alert comes after the loop in the script, which makes it look as if the length is read only after the loop has executed). Instead, call the display method only once the loop is guaranteed to have completed, as in the solution above.
Hope this is the right way. If I am wrong somewhere, please correct me.
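If it helps, the same fix can also be written with async/await, so that the read order matches the source order. A sketch, assuming the same collection and field names as above:

async function loadMusicPairs() {
  var db = firebase.firestore();
  var musicidarray = [];
  var musicpaircontentarray = [];

  // Wait for the query to finish before touching the arrays.
  const querySnapshot = await db.collection("MusicIdNamePairs").get();
  querySnapshot.forEach((doc) => {
    musicidarray.push(doc.id);
    musicpaircontentarray.push(doc);
  });

  alert(musicidarray.length); // Now reflects the fetched documents.
  for (var i = 0; i < musicpaircontentarray.length; i++) {
    alert(musicpaircontentarray[i].get("Name"));
  }
}

loadMusicPairs();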

How to delay a function like this? [duplicate]

Using the Google Geocoder v3, if I try to geocode 20 addresses, I get an OVER_QUERY_LIMIT unless I time them to be ~1 second apart, but then it takes 20 seconds before my markers are all placed.
Is there any other way to do it, other than storing the coordinates in advance?
No, there is not really any other way: if you have many locations and want to display them on a map, the best solution is to:
fetch the latitude+longitude, using the geocoder, when a location is created
store those in your database, alongside the address
and use those stored latitude+longitude when you want to display the map.
This is, of course, considering that you have a lot less creation/modification of locations than you have consultations of locations.
Yes, it means you'll have to do a bit more work when saving the locations -- but it also means:
You'll be able to search by geographical coordinates
i.e. "I want a list of points that are near where I'm now"
Displaying the map will be a lot faster
Even with more than 20 locations on it
Oh, and, also (last but not least) : this will work ;-)
You will be less likely to hit the limit of X geocoder calls in N seconds.
And you will be less likely to hit the limit of Y geocoder calls per day.
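A rough sketch of that geocode-once-and-store approach (saveLocation below is a hypothetical helper that persists to your own database):

var geocoder = new google.maps.Geocoder();

// Geocode once, when the location is created, and store the result.
function createLocation(address) {
  geocoder.geocode({ address: address }, function (results, status) {
    if (status === google.maps.GeocoderStatus.OK) {
      var loc = results[0].geometry.location;
      // saveLocation is a hypothetical helper that writes to your database.
      saveLocation({ address: address, lat: loc.lat(), lng: loc.lng() });
    }
  });
}

// Later, build the map markers from the stored coordinates; no geocoder calls needed.
function addMarker(map, stored) {
  return new google.maps.Marker({
    map: map,
    position: { lat: stored.lat, lng: stored.lng }
  });
}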
You actually do not have to wait a full second for each request. I found that if I wait 200 milliseconds between each request, I am able to avoid the OVER_QUERY_LIMIT response and the user experience is passable. With this solution you can load 20 items in 4 seconds.
$(items).each(function(i, item) {
  setTimeout(function() {
    geoLocate("my address", function(myLatlng) {
      ...
    });
  }, 200 * i);
});
Unfortunately this is a restriction of the Google maps service.
I am currently working on an application using the geocoding feature, and I'm saving each unique address on a per-user basis. I generate the address information (city, street, state, etc.) based on the information returned by Google Maps, and then save the lat/long information in the database as well. This prevents you from having to re-geocode things, and gives you nicely formatted addresses.
Another reason you want to do this is because there is a daily limit on the number of addresses that can be geocoded from a particular IP address. You don't want your application to fail for a person for that reason.
I'm facing the same problem trying to geocode 140 addresses.
My workaround was adding usleep(100000) in each loop before the next geocoding request. If the status of the request is OVER_QUERY_LIMIT, the usleep is increased by 50000 and the request is repeated, and so on.
And of course all received data (lat/long) are stored in an XML file so the requests don't run every time the page loads.
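The same retry-with-an-increasing-delay idea in browser JavaScript (a sketch, not the author's PHP code; the delay values are arbitrary):

function geocodeWithRetry(geocoder, address, delayMs, callback) {
  geocoder.geocode({ address: address }, function (results, status) {
    if (status === google.maps.GeocoderStatus.OVER_QUERY_LIMIT) {
      // Back off a little more and retry the same address.
      setTimeout(function () {
        geocodeWithRetry(geocoder, address, delayMs + 500, callback);
      }, delayMs);
    } else {
      callback(results, status);
    }
  });
}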
EDIT:
Forgot to say that this solution is in pure JS; the only thing you need is a browser that supports promises: https://developer.mozilla.org/it/docs/Web/JavaScript/Reference/Global_Objects/Promise
For those who still need to accomplish this, I've written my own solution that combines promises with timeouts.
Code:
/*
  class: Geolocalizer
  - Handles location triangulation and calculations.
  -- Returns various prototypes to fetch position from strings or coords or dragons or whatever.
*/
var Geolocalizer = function () {
  this.queue = [];    // queue handler..
  this.resolved = [];
  this.geolocalizer = new google.maps.Geocoder();
};

Geolocalizer.prototype = {
  /*
    #fn: Localize
    #scope: resolve single or multiple queued requests.
    #params: <array> needles
    #returns: <deferred> object
  */
  Localize: function (needles) {
    var that = this;
    // Enqueue the needles.
    for (var i = 0; i < needles.length; i++) {
      this.queue.push(needles[i]);
    }
    // Return a promise and resolve it after every element has been fetched
    // (either with success or failure), then reset the queue.
    return new Promise(
      function (resolve, reject) {
        that.resolveQueueElements().then(function (resolved) {
          resolve(resolved);
          that.queue = [];
          that.resolved = [];
        });
      }
    );
  },

  /*
    #fn: resolveQueueElements
    #scope: resolve queue elements.
    #returns: <deferred> object (promise)
  */
  resolveQueueElements: function (callback) {
    var that = this;
    return new Promise(
      function (resolve, reject) {
        // Loop the queue and resolve each element.
        // Prevent QUERY_LIMIT by delaying actions by one second.
        (function loopWithDelay(such, queue, i) {
          console.log("Attempting the resolution of " + queue[i - 1]);
          setTimeout(function () {
            such.find(queue[i - 1], function (res) {
              such.resolved.push(res);
            });
            if (--i) {
              loopWithDelay(such, queue, i);
            }
          }, 1000);
        })(that, that.queue, that.queue.length);

        // Check every second if the queue has been cleared.
        var it = setInterval(function () {
          if (that.queue.length == that.resolved.length) {
            resolve(that.resolved);
            clearInterval(it);
          }
        }, 1000);
      }
    );
  },

  /*
    #fn: find
    #scope: resolve an address from string
    #params: <string> s, <fn> Callback
  */
  find: function (s, callback) {
    this.geolocalizer.geocode({
      "address": s
    }, function (res, status) {
      if (status == google.maps.GeocoderStatus.OK) {
        var r = {
          originalString: s,
          lat: res[0].geometry.location.lat(),
          lng: res[0].geometry.location.lng()
        };
        callback(r);
      }
      else {
        callback(undefined);
        console.log(status);
        console.log("could not locate " + s);
      }
    });
  }
};
Please note that it's just a part of a bigger library I wrote to handle google maps stuff, hence comments may be confusing.
Usage is quite simple; the approach, however, is slightly different: instead of looping and resolving one address at a time, you pass an array of addresses to the class and it handles the search by itself, returning a promise which, when resolved, returns an array containing all the resolved (and unresolved) addresses.
Example:
var myAmazingGeo = new Geolocalizer();
var locations = ["Italy", "California", "Dragons are thugs...", "China", "Georgia"];
myAmazingGeo.Localize(locations).then(function (res) {
  console.log(res);
});
Console output:
Attempting the resolution of Georgia
Attempting the resolution of China
Attempting the resolution of Dragons are thugs...
Attempting the resolution of California
ZERO_RESULTS
could not locate Dragons are thugs...
Attempting the resolution of Italy
Object returned: (screenshot omitted)
The whole magic happens here:
(function loopWithDelay(such, queue, i) {
  console.log("Attempting the resolution of " + queue[i - 1]);
  setTimeout(function () {
    such.find(queue[i - 1], function (res) {
      such.resolved.push(res);
    });
    if (--i) {
      loopWithDelay(such, queue, i);
    }
  }, 750);
})(that, that.queue, that.queue.length);
Basically, it loops over every item with a delay of 750 milliseconds between each of them, so an address is looked up every 750 milliseconds.
I've done some further testing and found that even at 700 milliseconds I was sometimes getting the QUERY_LIMIT error, while with 750 I haven't had any issue at all.
In any case, feel free to lower the 750 above if you feel a shorter delay is safe for you.
Hope this helps someone in the near future ;)
I have just tested the Google Geocoder and ran into the same problem as you. I noticed I only get the OVER_QUERY_LIMIT status once every 12 requests, so I wait for 1 second (that's the minimum delay to wait). It slows down the application, but less than waiting 1 second between every request:
info = getInfos(getLatLng(code)); // In here I call the Google API
record(code, info);
generated++;
if (generated % interval == 0) {
    holdOn(delay); // Every x requests, I sleep for 1 second
}
With the basic holdOn method:
private void holdOn(long delay) {
    try {
        Thread.sleep(delay);
    } catch (InterruptedException ex) {
        // ignore
    }
}
Hope it helps
This worked well for me after intermittent trial and error over the past couple of days. I am using React instant-search-hooks via Algolia with Next.js and Sanity for a new jobs site for a large company.
Postal code is a facet for filtering/sorting/query matching that is defined in the Algolia index. In another script file I map out all of these facets (postal code, city, etc.). Now that I have 100 returned files, they can be mapped out by iterating through a mapped asynchronous import, with the lat/lng coords matched to the corresponding zip codes defining a job posting (there are ~2500 postings but only ~100 zip codes to narrow the coordinates down for).
import * as dotenv from "dotenv";
dotenv.config();
import {
  googleNetwork,
  axiosConfig as googleAxiosConfig
} from "../utils/google-axios";
import JSONData from "../../public/data/postalCode/2022/05/26.json";
import fs from "fs";
import { join } from "path";
import type { GeneratedGeolocData } from "../types/algolia";
import { timezoneHelper } from "../utils/timezone-helper";
import { Unenumerate } from "../types/helpers";

let i = 0;
i < JSONData.postalCodes.facetHits.length;
i++;

const getGeoCode = (
  record: Unenumerate<typeof JSONData.postalCodes.facetHits>
) =>
  function () {
    return JSONData.postalCodes.facetHits.map(async (data = record, u) => {
      const googleBase = process.env.NEXT_PUBLIC_GOOGLE_MAPS_BASE_PATH ?? "";
      const googleApiKey =
        process.env.NEXT_PUBLIC_TAKEDA_JOBS_GOOGLE_SERVICES ?? "";
      const params: (string | undefined)[][] = [
        ["address", data.value],
        ["key", googleApiKey]
      ];
      const query = params
        .reduce<string[]>((arr, [k, v]) => {
          if (v) arr.push(`${k}=${encodeURIComponent(v)}`);
          return arr;
        }, [])
        .join("&");
      return await googleNetwork("GET")
        .get(`${googleBase}geocode/json?${query}`, googleAxiosConfig)
        .then(dat => {
          const geoloc = dat.data as GeneratedGeolocData;
          const {
            [0]: Year,
            [2]: Month,
            [4]: Day
          } = new Date(Date.now())
            .toISOString()
            .split(/(T)/)[0]
            .split(/([-])/g);
          const localizedTimestamp = timezoneHelper({
            dateField: new Date(Date.now()),
            timezone: "America/Chicago"
          });
          return setTimeout(
            () =>
              fs.appendFileSync(
                join(
                  process.cwd(),
                  `public/data/geoloc/${Year}/${Month}/${Day}-${[i]}.json`
                ),
                JSON.stringify(
                  {
                    generated: localizedTimestamp,
                    _geoloc: {
                      postalCode: data.value,
                      geolocation: geoloc
                    }
                  },
                  null,
                  2
                )
              ),
            1000
          );
        });
    });
  };

getGeoCode(JSONData.postalCodes.facetHits[i]);
It took a lot less time than anticipated -- under 4 seconds for 100 unique results to generate
Context on the Unenumerate type -- Unenumerate strips the internal repeating unit within an array:
type Unenumerate<T> = T extends Array<infer U> ? U : T;

Java/Firebase Script Executing Multiple Times

I am having an interesting issue. The general idea of what I am doing is pulling data from a Firebase database and populating a table based on that data. Everything runs perfectly during the initial population--cells and rows are populated as they should be--but the weird issue is that the scripts seem to execute again at random. I've logged the incoming data to the console and can see it print twice after some amount of time.
This second execution does not happen if I navigate between pages or reload the page--in either of those cases everything works as it should. The problem SEEMS to happen when I log back into my computer after locking it? Does anybody have ANY idea what could be going on here? Relevant portion of the script below:
const table = document.getElementById('myTable');

firebase.auth().onAuthStateChanged(firebaseUser => {
  if (firebaseUser) {
    let user = firebase.auth().currentUser;
    let uid = user.uid;
    const dbRef = firebase.database().ref().child("data/" + uid);
    dbRef.once('value', snap => {
      var dataCount = snap.child("secondData").numChildren();
      var datalist = snap.child("secondData").val();
      var dataArray = Object.keys(datalist).map(function(k) {
        return datalist[k]
      });
      pullAllInfo(dataCount, dataArray);
    });
  }
});
function pullAllInfo(count, array) {
  let k = 0;
  let dataArray = [];
  for (i = 0; i < count; i++) {
    let specificRef = firebase.database().ref().child("secondData/" + array[i]);
    specificRef.once('value', snap => {
      var optionsTag = array[k];
      k++;
      var dataId = snap.child("id").val();
      var dataName = snap.child("name").val();
      var dataCount = snap.child("data").numChildren();
      dataArray.push(dataId, dataName, dataCount, optionsTag);
      if (k == count) {
        buildTable(dataArray);
        console.log(dataArray);
      }
    });
  }
}
As you can see from the code above, I AM calling .once() for each reference, which would prevent the data duplication you get from a typical .on() call. I just can't seem to figure this one out. ALSO, I have an iMac, for anyone curious about my potential computer-unlock diagnosis.
Thanks all!
Most likely, the auth state is changing and setting off your function. Try throwing a log under firebase.auth().onAuthStateChanged, like this:
firebase.auth().onAuthStateChanged(firebaseUser => {
  console.log('auth state changed', firebaseUser);
  if (firebaseUser) {
My guess is that you'll see that the AuthState is changing when you log out/log in from your computer.
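If you only want the table load to run once per page load, one common pattern (a sketch, not part of the original answer) is to detach the listener after its first call:

const unsubscribe = firebase.auth().onAuthStateChanged(firebaseUser => {
  if (firebaseUser) {
    unsubscribe(); // Stop listening so later auth-state changes don't re-run the load.
    loadTableData(firebaseUser.uid); // Hypothetical wrapper around the dbRef.once(...) logic above.
  }
});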
I solved this issue by creating another global boolean called preloaded. At the beginning it is set to false; once the data has loaded and been passed off to build the table, it is set to true. It now looks like this:
if (k == count && preloaded == false) {
  preloaded = true;
  console.log(dataArray);
  buildTable(dataArray);
}
All set!
