Complete the algorithm for generating Sudoku puzzles - javascript

I have the code below, but there's a problem with the current implementation of the function entriesToDel: it will not always produce n distinct coordinates to be turned blank in a puzzle. It should always return an array of n two-element arrays, each storing the coordinates of an entry, with every coordinate pair distinct. I'm not sure how to do that, as I'm quite new to this.
// a function to randomly select n (row,column) entries of a 2d array
function entriesToDel(n) {
  var array = [];
  for (var i = 0; i < n; i++) {
    var row = Math.round(3 * Math.random());
    var col = Math.round(3 * Math.random());
    array.push([row, col]);
  }
  return array;
}

A simple solution would be to use a set instead of an array and keep adding random numbers until you have enough unique entries.
In case you don't know what a set is: a set works similarly to an array, but it does not allow duplicates (adding a duplicate is ignored). It also does not give elements a fixed position; you can't access an entry of a set with set[i].
const size = 3 // define size of grid in variable

function entriesToDel(n) {
  // create set of positions
  const positions = new Set()
  while (positions.size < n) { // repeat until the set has enough entries
    // add a random position (duplicates are ignored by the set)
    positions.add(Math.floor((size ** 2) * Math.random()))
  }
  // convert the set of positions to an array of coordinates
  const cords = []
  for (const position of positions) {
    // convert each position to a row and column and add them to the array
    const row = Math.floor(position / size)
    const column = position % size
    cords.push([row, column])
  }
  return cords
}
This solution is nice and simple. One issue is that if you're really unlucky, the random number generator will keep producing numbers that are already in the set, resulting in a possibly very long run time. If you want to exclude this (very unlikely) possibility, it would be reasonable to use the more complex solution provided by James.
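For reference, here is a minimal sketch of such a worst-case-proof approach (my own illustration, not necessarily James's exact code): list every cell index, shuffle the list with a Fisher-Yates shuffle, and take the first n. Every pick is guaranteed distinct, so the run time is bounded.
const size = 3 // same grid-size variable as above
function entriesToDel(n) {
  // enumerate every cell index 0 .. size^2 - 1
  const positions = []
  for (let i = 0; i < size ** 2; i++) positions.push(i)
  // Fisher-Yates shuffle: uniformly random order, no retries needed
  for (let i = positions.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1))
    const tmp = positions[i]
    positions[i] = positions[j]
    positions[j] = tmp
  }
  // the first n indices are n distinct cells; convert them to coordinates
  return positions.slice(0, n).map(p => [Math.floor(p / size), p % size])
}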


MxGraph JavaScript: How to calculate edge length?

I am looking for a way to calculate the length of the edges in a graph.
As an example, take this image: the connecting edge consists of three parts (a, b and c).
I have no idea how to retrieve this information so that I can sum up the distances a + b + c.
On a more complex graph I want to calculate the length of each edge. There I would loop through all cells of the graph model, check with .isEdge whether it is an edge, and then calculate the length of each edge.
let cells = graph.getModel().cells;
for (let key in cells) {
  let mxCell = cells[key];
  if (mxCell.isEdge()) {
    calcLength(mxCell);
  }
}
calcLength() is the function I need. This one should return the length of the edge.
I used the helloPorts example from jGraph.
Thanks in advance!!
Back again,
together with a friend we found the solution. The information is stored in the cell's state (available via the graphView) as the length.
graphView.getState(mxCell).length
This is part of the state object for the mxCell. If you are looking for the Euclidean length of an edge, that is stored under terminalDistance.
graphView.getState(edge).terminalDistance
The code would look like this to access it:
let cells = graph.getModel().cells;
let graphView = graph.getView();
// getModel().endUpdate() needs to be triggered before the loop, otherwise the mxCell state is undefined
graph.getModel().endUpdate();
for (let key in cells) {
  let mxCell = cells[key];
  if (mxCell.isEdge()) {
    let edgeLength = graphView.getState(mxCell).length;
  }
}
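For completeness, a minimal sketch of the calcLength() function the question asked for, based on the state lookup above (the null check is defensive; cells without a computed state return 0):
function calcLength(mxCell) {
  // look up the cell's view state and read the precomputed length
  let state = graphView.getState(mxCell);
  return state ? state.length : 0;
}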

JS - How to check if 2 images (their hash) are similar

GOAL
Finding a good way to check if 2 images are similar by comparing their hash profiles. The hash is a simple array containing 0 and 1 values.
INTRO
I have 2 images. They are the same image but with some little differences: one has a different brightness, rotation and shot.
What I want to do is create a Javascript method to compare the 2 images and calculate a percentage value that tells how much they are similar.
WHAT I'VE DONE
After uploading the 2 images into an HTML5 canvas to get their image data, I've used the pHash algorithm (www.phash.org) to obtain their hash representation.
The hash is an array containing 0 and 1 values that recreates the image in a "simplified" form.
I've also created a JS script that generates an HTML table with black cells where the array contains 1. The result is the following screenshot (the image is a Van Gogh picture):
(screenshot omitted)
Now, what I should do is compare the 2 arrays to obtain a percentage value that tells "how much" they are similar.
Most of the JavaScript hash algorithms I've found by googling already come with a comparison algorithm: the Hamming distance. It's very simple and fast, but not very precise. In fact, the Hamming distance algorithm says that the 2 images in my screenshot have a 67% similarity.
THE QUESTION
Starting with 2 simple arrays, with the same length, filled with 0 and 1 values: what could be a good algorithm to determine similarity more precisely?
NOTES
- Pure JavaScript development, no third-party plugins or frameworks.
- No need for a complex algorithm to find the right similarity when the 2 images are the same but look very different (strong rotation, totally different colors, etc.).
Thanks
PHASH CODE
// Size is the image size (for example 128px); imgData is the canvas ImageData
var pixels = [];
for (var i = 0; i < imgData.data.length; i += 4) {
  var j = i / 4; // pixel index (each pixel spans 4 RGBA channels)
  var y = Math.floor(j / size);
  var x = j - (y * size);
  var pixelPos = x + (y * size);
  var r = imgData.data[i];
  var g = imgData.data[i + 1];
  var b = imgData.data[i + 2];
  // greyscale value using the standard luma weights
  var gs = Math.floor((r * 0.299) + (g * 0.587) + (b * 0.114));
  pixels[pixelPos] = gs;
}
// average grey level of the whole image
var avg = Math.floor(pixels.reduce(function (a, b) { return a + b; }, 0) / pixels.length);
var hash = [];
pixels.forEach(function (px, i) {
  hash[i] = px > avg ? 1 : 0;
});
return hash;
HAMMING DISTANCE CODE
// hash1 and hash2 are the arrays of the "coded" images.
var similarity = hash1.length;
hash1.forEach(function (val, key) {
  if (hash1[key] != hash2[key]) {
    similarity--;
  }
});
var percentage = (similarity / hash1.length * 100).toFixed(2);
I'm using blockhash, and it seems pretty good so far; the only false positives I get are when half of the pictures are the same background color, which is to be expected =/
http://blockhash.io/
BlockHash may be slower than yours, but it should be more accurate.
What you do is just calculate the greyscale of EACH pixel and compare it to the average to create your hash.
What BlockHash does is split the picture into small rectangles of equal size, average the sum of the RGB values of the pixels inside them, and compare them to 4 horizontal medians.
So it is normal that it takes longer, but it is still pretty efficient and accurate.
I'm doing it with pictures of a good resolution, at minimum 1000x800, and use 16 bits. This gives a 64-character-long hexadecimal hash. When using the Hamming distance provided by the same library, I see good results with a similarity threshold of 10.
Your idea of using greyscale isn't bad at all. But you should average out portions of the image instead of comparing each pixel. That way you can compare a thumbnail version to its original and get pretty much the same pHash!
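As an illustration of that block-averaging idea (my own sketch, not the actual blockhash.io algorithm, and assuming the image size is divisible by the block count): split the greyscale pixels into a grid of blocks, average each block, and threshold each average against the overall mean.
// pixels: flat array of greyscale values, size: image width/height in px,
// blocks: blocks per side (e.g. 8 gives a 64-bit hash)
function blockHash(pixels, size, blocks) {
  const blockSize = size / blocks; // assumes size % blocks === 0
  const means = [];
  for (let by = 0; by < blocks; by++) {
    for (let bx = 0; bx < blocks; bx++) {
      let sum = 0;
      // sum all pixels that fall inside this block
      for (let y = 0; y < blockSize; y++) {
        for (let x = 0; x < blockSize; x++) {
          sum += pixels[(by * blockSize + y) * size + (bx * blockSize + x)];
        }
      }
      means.push(sum / (blockSize * blockSize));
    }
  }
  const avg = means.reduce((a, b) => a + b, 0) / means.length;
  return means.map(m => (m > avg ? 1 : 0)); // 1 bit per block
}
A hash built this way stays stable across thumbnail sizes, since each bit summarizes a region rather than a single pixel.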
I don't know if this can do the trick, but you can just count the 0 and 1 matches between the arrays:
const arr1 = [1,1,1,1,1,1,1,1,1,1],
      arr2 = [0,0,0,0,0,0,0,0,0,0],
      arr3 = [0,1,0,1,0,1,0,1,0,1],
      arr4 = [1,1,1,0,1,1,1,0,1,1]
const howSimilar = (a1, a2) => {
  let similarity = 0
  a1.forEach((elem, index) => {
    if (a2[index] == elem) similarity++
  })
  // divide by a1.length (not the outer arr1) so the function works for any input
  let percentage = Math.round(similarity / a1.length * 100) + "%"
  console.log(percentage)
}
howSimilar(arr1, arr2) // 0%
howSimilar(arr1, arr3) // 50%
howSimilar(arr1, arr4) // 80%

Javascript error with if statement inside of a for loop/array insertion and construction

Backstory and Context: I will do my best to explain what I am working on, where the problem is that I have identified, and my thought process for my current code. Bear with me as I will try to provide as much detail as possible. Note that I am still learning Quartz Composer and am rather unfamiliar with Javascript syntax.
I am working with Quartz Composer to modify the visual equalizer template that comes with the program. Essentially, my audio feed gets input into the program where it is processed as 16 audio frequency bands that are stored in an array. Each of these 16 bands corresponds to the magnitude of the frequency in that range, and will ultimately display on a "bar graph" with the various equalizer levels. This part is all well and good.
My task requires more than 16 bars in my graph. Since I cannot split the audio any further without turning to a somewhat complicated external process, I figured I could fake my way through by inserting fake bars in between the actual audio bars, with values that average or evenly space themselves between the true audio bar values.
For example, suppose that my input array looked as such "A B C" where "A," "B," and "C" are some audio value. I declare in my code a certain "insertion" value that determines how many fake bars are to be added in between each of my true audio values. In the case of "A B C," suppose that the insertion value were set to 0. No bars are to be inserted, thus the output array is "A B C." Suppose that the insertion value is set to 1. 1 bar is to be inserted in between each of the true values, returning an array of "A (A+B)/2 B (B+C)/2 C." As such, if the insertion value is set to 2, two values will be placed between A and B such that they are evenly spaced in between, aka 1/3 and 2/3rds of the way between A and B respectively.
Relevant Code:
var array = new Array();
function (__structure outputStructure) main (__structure inputStructure, __number insertions, __number time) {
  var result = new Object();
  /* check if the input array for the sound bands is null or not */
  if (inputStructure != null) {
    /* track how many values have been inserted so that, when the counter resets,
       a value from the original array can be inserted */
    var counter = 0;
    /* track which index of the original array the next true value is pulled from */
    var inputPlace = 0;
    /* build a new array that inserts `insertions` values between all values of the original array */
    for (i = 0; i < ((insertions + 1) * (inputStructure.length - 1) + 1); ++i) {
      /* if it is time to do so, pull a true audio bar value from the original array into the new array */
      if (counter = 0) {
        array[i] = inputStructure[inputPlace];
      /* otherwise, insert an evenly spaced, calculated bar value into the new array */
      } else {
        /* space the insertion between the nearest two true values from the original input array */
        array[i] = inputStructure[inputPlace] + ((inputStructure[inputPlace + 1] - inputStructure[inputPlace]) * counter / (insertions + 1));
      }
      counter = counter + 1;
      /* reset the counter after each complete run of insertions so that an original array value can be pulled into the new array */
      if (counter > insertions) {
        counter = 0;
        inputPlace = inputPlace + 1;
      }
      counter = counter + 1;
    }
  }
  result.outputStructure = array;
  return result;
}
In this particular code, I do not have any inputs manually declared, as they will all be pulled in from Quartz Composer. I have had no problems with this part. For testing purposes, I assume you could just hard-code values for inputStructure, insertions, and time.
Problem:
The problem that I have identified seems to be with the if statement inside of my for loop. When I run the code, it sets every value in the new array to 0.
Question: I guess my question is what am I doing wrong with the if statement in the for loop that is causing it to overwrite all of my other iteration values? I was able to get the counter working such that each iteration would return the counter value "0 1 2 0 1 2" in the case of the insertion being set to 2, but I was unable to say okay, now check during each iteration, and if the counter is 0, pull a value from the original array. Otherwise, do something else, in this case, calculate and add the fake bar value into the new array.
Your problem is this line:
if (counter = 0) {
That is an assignment, not a comparison: it sets counter to 0, and since 0 is falsy the condition always evaluates to false. To compare, you need to use == or ===.
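For illustration, the corrected branch (everything else unchanged; note that the loop above also appears to increment counter twice per iteration, once before and once after the reset check, which may also need attention for the counter to cycle 0 1 2 as described):
if (counter === 0) {
  /* pull a true audio value from the original array */
  array[i] = inputStructure[inputPlace];
} else {
  /* insert an interpolated value between the two nearest true values */
  array[i] = inputStructure[inputPlace] + ((inputStructure[inputPlace + 1] - inputStructure[inputPlace]) * counter / (insertions + 1));
}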

Memory-efficient downsampling (charting) of a growing array

A node process of mine receives a sample point every half a second, and I want to update the history chart of all the sample points I receive.
The chart should be an array which contains the downsampled history of all points from 0 to the current point.
In other words, the maximum length of the array should be l. If I received more sample points than l, I want the chart array to be a downsampled-to-l version of the whole history.
To express it with code:
const CHART_LENGTH = 2048
createChart(CHART_LENGTH)
onReceivePoint = function (p) {
  // p can be considered a number
  const chart = addPointToChart(p)
  // chart is an array representing all the samples received, from 0 to now
  console.assert(chart.length <= CHART_LENGTH)
}
I already have a working downsampling function with number arrays:
function downsample(arr, density) {
  const downsampled = []
  for (let i = 0; i < arr.length; i++) {
    // bucket in the downsampled array this sample falls into
    const j = Math.floor(i / arr.length * density)
    if (downsampled[j] == null) downsampled[j] = 0
    // accumulate the sample, scaled by the compression ratio
    downsampled[j] += Math.abs(arr[i] * density / arr.length)
  }
  return downsampled
}
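For example, a quick sanity check of the function (CHART_LENGTH as defined above):
const samples = Array.from({ length: 10000 }, () => Math.random())
const chart = downsample(samples, CHART_LENGTH)
console.assert(chart.length <= CHART_LENGTH) // 10000 samples squeezed into 2048 buckets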
One trivial way of doing this would obviously be to save all the points I receive into an array and apply the downsample function whenever the array grows. This would work, but since this piece of code would run on a server, possibly for months and months in a row, the supporting array would eventually grow so large that the process would run out of memory.
The question is: is there a way to construct the chart array re-using the previous contents of the chart itself, to avoid maintaining a growing data structure? In other words, is there a constant memory complexity solution to this problem?
Please note that the chart must contain the whole history since sample point #0 at any moment, so charting the last n points would not be acceptable.
The only operation that does not distort the data and that can be used several times is aggregation of an integer number of adjacent samples. You probably want 2.
More specifically: if you find that adding a new sample would exceed the array bounds, do the following: start at the beginning of the array and average each pair of subsequent samples. This halves the array size and gives you space to add new samples. While doing so, you should keep track of the current cluster size c (the number of samples that constitute one entry in the array). You start with one; every reduction multiplies the cluster size by two.
Now the problem is that you cannot add new samples directly to the array any more because they have a completely different scale. Instead, you should average the next c samples to a new entry. It turns out that it is sufficient to store the number of samples n in the current cluster to do this. So if you add a new sample s, you would do the following.
n++
if n == 1
    append s to array
else
    // update the running average of the current cluster
    last array element += (s - last array element) / n
if n == c
    n = 0 // start a new cluster
So the memory that you actually need is the following:
the history array with predefined length
the number of elements in the history array
the current cluster size c
the number of elements in the current cluster n
The size of the additional memory does not depend on the total number of samples, hence O(1).
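A runnable sketch of the whole scheme (my naming; it plays the role of createChart/addPointToChart from the question, and pair-averaging assumes maxLength is even, as 2048 is):
class DownsamplingChart {
  constructor(maxLength) {
    this.maxLength = maxLength // predefined history length, e.g. 2048
    this.chart = [] // the history array, never longer than maxLength
    this.c = 1      // current cluster size (samples per array entry)
    this.n = 0      // samples accumulated in the current cluster
  }
  addPoint(s) {
    if (this.n === 0 && this.chart.length === this.maxLength) {
      // the array is full: average adjacent pairs, halving it
      const halved = []
      for (let i = 0; i < this.chart.length; i += 2) {
        halved.push((this.chart[i] + this.chart[i + 1]) / 2)
      }
      this.chart = halved
      this.c *= 2 // each entry now represents twice as many samples
    }
    this.n++
    if (this.n === 1) {
      this.chart.push(s) // start a new entry
    } else {
      // incremental running average of the current cluster
      const last = this.chart.length - 1
      this.chart[last] += (s - this.chart[last]) / this.n
    }
    if (this.n === this.c) this.n = 0 // cluster complete
  }
}
Memory use is bounded by the fixed-length array plus the counters c and n, matching the O(1) claim above.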

Efficiently ordering line segments into a loop

I'm using a library (JavaScript-Voronoi) which produces an array of line segments that represent a closed polygon. These segments appear unordered, both the order in which the segments appear as well as the ordering of the points for each end of the segment.
(Edit: As noted in a comment below, I was wrong: the segments from the library are well-ordered. However, the question stands as written: let's assume that the segments do not have any ordering, as this makes it more generally useful.)
For example:
var p1 = {x:13.6, y:13.1}, p2 = {x:37.2, y:35.8}, p3 = {x:99.9, y:14.6},
    p4 = {x:99.9, y:45.5}, p5 = {x:33.7, y:66.7};
var segments = [
  { va:p1, vb:p2 },
  { va:p3, vb:p4 },
  { va:p5, vb:p4 },
  { va:p3, vb:p2 },
  { va:p1, vb:p5 }
];
Notice how the first segment links to the last (they share a common point) and to the next-to-last. It is guaranteed that each end of a segment is shared with exactly one other segment.
I would like to convert this into a list of points to generate a proper SVG polygon:
console.log( orderedPoints(segments) );
// [
// {"x":33.7,"y":66.7},
// {"x":13.6,"y":13.1},
// {"x":37.2,"y":35.8},
// {"x":99.9,"y":14.6},
// {"x":99.9,"y":45.5}
// ]
It doesn't matter whether the points are in clockwise or counter-clockwise order.
The following code is what I've come up with, but in the worst-case scenario it will take n^2+n point comparisons. Is there a more efficient algorithm for joining all these together?
function orderedPoints(segs) {
  segs = segs.concat(); // make a mutable copy
  var seg = segs.pop(), pts = [seg.va], link = seg.vb;
  for (var ct = segs.length; ct--;) {
    for (var i = segs.length; i--;) {
      if (segs[i].va == link) {
        seg = segs.splice(i, 1)[0];
        pts.push(seg.va);
        link = seg.vb;
        break;
      } else if (segs[i].vb == link) {
        seg = segs.splice(i, 1)[0];
        pts.push(seg.vb);
        link = seg.va;
        break;
      }
    }
  }
  return pts;
}
If your polygon is convex, you can pick the middle point of each line segment, then use a convex hull algorithm to order the midpoints; because you know the arrangement of the midpoints and which midpoint belongs to which segment, you can recover the ordering of the original array.
If you just want to find a convex hull, use a convex hull algorithm directly; it's O(n log n), which is fast enough. You can also find a Quickhull algorithm in JavaScript here; Quickhull is O(n log n) on average and O(n^2) in the worst case, but it is fast in practice because of a smaller constant factor.
But in the case of a general algorithm:
Set one end of each segment as first and the other end as second (randomly).
Sort your segments by their first x and put them in an array First; within First, sort segments that share the same first x by their first y, and store two extra ints in your structure to save the start and end positions of the items with the same first x.
Then sort your segments again by their second x value in the same way to make an array Second.
Both of the above steps are O(n log n).
Now pick the first segment in array First and search for its second x value in both arrays First and Second; where you find matching values, search for the matching y values in the related subarray (you have the start and end positions of items with the same x). You know there is exactly one segment with this endpoint (other than the current segment), so finding the next segment takes O(log n), and since there are n-1 next segments in total, the whole process (including preprocessing) takes O(n log n), which is far faster than O(n^2).
It should be possible to turn the points into a (doubly, unordered?) linked list in linear time:
for (var i = 0; i < segments.length; i++) {
  var a = segments[i].va,
      b = segments[i].vb;
  // nexts being the two adjacent points (in unknown order)
  if (a.nexts) a.nexts.push(b); else a.nexts = [b];
  if (b.nexts) b.nexts.push(a); else b.nexts = [a];
}
Now you can iterate it to build the array:
var prev = segments[0].va,
    start = segments[0].vb, // start somewhere, in some direction
    points = [],
    cur = start;
do {
  points.push(cur);
  var nexts = cur.nexts,
      next = nexts[0] == prev ? nexts[1] : nexts[0];
  delete cur.nexts; // un-modify the object
  prev = cur;
  cur = next;
} while (cur && cur != start)
return points;
If you do not want to modify the objects, an EcmaScript6 Map (with object keys) would come in handy. As a workaround, you could use a JSON serialisation of your point coordinates as keys of a normal object, however you are then limited to polygons that do not contain a coordinate twice. Or just use the unique voronoiId property that your library adds to the vertices for identifying them.
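A minimal sketch of that Map-based variant, assuming each vertex carries the unique voronoiId property mentioned above:
function orderedPoints(segments) {
  // adjacency: voronoiId -> the two neighbouring points of that vertex
  const nexts = new Map();
  for (const { va, vb } of segments) {
    if (!nexts.has(va.voronoiId)) nexts.set(va.voronoiId, []);
    if (!nexts.has(vb.voronoiId)) nexts.set(vb.voronoiId, []);
    nexts.get(va.voronoiId).push(vb);
    nexts.get(vb.voronoiId).push(va);
  }
  // walk the loop without ever mutating the vertex objects
  const start = segments[0].vb;
  let prev = segments[0].va, cur = start;
  const points = [];
  do {
    points.push(cur);
    const [n0, n1] = nexts.get(cur.voronoiId);
    const next = n0 === prev ? n1 : n0;
    prev = cur;
    cur = next;
  } while (cur && cur !== start);
  return points;
}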
For a convex polygon, you don't even need to know the side segments; you just need a bunch of vertices. The procedure to order the vertices is pretty simple:
1. Average all the vertices together to get a point inside the polygon. Note that this doesn't even need to be the centroid; it just needs to be a point inside the polygon. Call this point C.
2. For each vertex V[i], compute the angle that the line segment from V[i] to C forms with the line segment from V[i] to V[i]+(1,0). Call this a[i].
3. Sort the angles a[i], using the vertices as satellite data.
The sorted vertices are in order around the polygon. There are some redundancies that you can remove. Steps 1 and 2 run in linear time; step 3 runs in O(n log n).
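A small sketch of this procedure (my own illustration), using Math.atan2 to get each vertex's angle around the interior point:
function orderConvexVertices(vertices) {
  // step 1: a point inside the polygon, the average of all vertices
  const cx = vertices.reduce((sum, v) => sum + v.x, 0) / vertices.length;
  const cy = vertices.reduce((sum, v) => sum + v.y, 0) / vertices.length;
  // steps 2 and 3: sort the vertices by their angle around (cx, cy)
  return vertices.slice().sort((a, b) =>
    Math.atan2(a.y - cy, a.x - cx) - Math.atan2(b.y - cy, b.x - cx));
}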
