
d3-contour's Introduction

D3: Data-Driven Documents

D3 (or D3.js) is a free, open-source JavaScript library for visualizing data. Its low-level approach built on web standards offers unparalleled flexibility in authoring dynamic, data-driven graphics. For more than a decade D3 has powered groundbreaking and award-winning visualizations, become a foundational building block of higher-level chart libraries, and fostered a vibrant community of data practitioners around the world.


d3-contour's People

Contributors

dongli, fil, jheer, mbostock, stof


d3-contour's Issues

Apply linear interpolation when incrementing the grid?

Currently we round data points to the nearest grid point before blurring. It would probably be better to distribute each data point over the 2×2 block of surrounding grid points using bilinear interpolation, something like this:

data.forEach(function(d, i, data) {
  // Fractional grid coordinates of the data point.
  var xi = (x(d, i, data) + o) * Math.pow(2, -k),
      yi = (y(d, i, data) + o) * Math.pow(2, -k);
  if (xi >= 0 && xi < n - 1 && yi >= 0 && yi < m - 1) {
    var x0 = Math.floor(xi),
        y0 = Math.floor(yi),
        xt = xi - x0, // fractional offsets in [0, 1)
        yt = yi - y0;
    // Distribute unit weight over the four surrounding grid points.
    values0[x0 + y0 * n] += (1 - xt) * (1 - yt);
    values0[x0 + 1 + y0 * n] += xt * (1 - yt);
    values0[x0 + 1 + (y0 + 1) * n] += xt * yt;
    values0[x0 + (y0 + 1) * n] += (1 - xt) * yt;
  }
});

Sometimes the lowest threshold is missing

E.g., if the domain is

[-149.76192742819748, 321.19300631539585]

and 14 ticks are desired, then the estimated tick step is 50

tickStep(-149.76192742819748, 321.19300631539585, 14)

and hence the resulting “niced” domain is

[-150, 250]

which when passed to ticks as

ticks(-150, 250, 14)

produces

[-140, -120, -100, -80, -60, -40, -20, 0, 20, 40, 60, 80, 100, 120, 140, 160, 180, 200, 220, 240]

and notice that -140 is greater than the original minimum. Related: d3/d3-scale#209

We should be using d3.nice instead. d3/d3-array#174

ticks(...nice(-149.76192742819748, 321.19300631539585, 14), 14)

which produces

[-150, -100, -50, 0, 50, 100, 150, 200, 250, 300, 350]

although we should drop any threshold greater than or equal to the maximum.
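For reference, a minimal sketch that reproduces both results; this reimplements the 1-2-5 tick logic for illustration and is not the d3-array source:

```javascript
const e10 = Math.sqrt(50), e5 = Math.sqrt(10), e2 = Math.sqrt(2);

// Pick a "nice" step of 1, 2, or 5 times a power of ten.
function tickIncrement(start, stop, count) {
  const step = (stop - start) / count,
        power = Math.floor(Math.log10(step)),
        error = step / Math.pow(10, power);
  return (error >= e10 ? 10 : error >= e5 ? 5 : error >= e2 ? 2 : 1) * Math.pow(10, power);
}

// Ticks are the step multiples lying inside [start, stop].
function ticks(start, stop, count) {
  const step = tickIncrement(start, stop, count), result = [];
  for (let i = Math.ceil(start / step); i <= Math.floor(stop / step); ++i) result.push(i * step);
  return result;
}

// d3.nice-style expansion: widen the domain so both ends land on the step.
function nice(start, stop, count) {
  let prestep, step;
  while ((step = tickIncrement(start, stop, count)) !== prestep) {
    prestep = step;
    start = Math.floor(start / step) * step;
    stop = Math.ceil(stop / step) * step;
  }
  return [start, stop];
}

ticks(-150, 250, 14)[0];                                         // -140: misses the minimum
ticks(...nice(-149.76192742819748, 321.19300631539585, 14), 14); // [-150, -100, ..., 350]
```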

Fix beveling on canvas edges.


I think we need to special-case the segments that are added along the canvas edges, rather than treating the virtual pixels surrounding the canvas as if they were always excluded from the contour.

Large example images are included in the released package

Hi, the build systems of several projects, such as Kibana and Wikimedia's Node.js services, package the entire node_modules/ into a single file. Inspecting my node_modules, I noticed that d3-contour is 5.8 MB even though the code is only a few KB. I seriously doubt that the images, such as the 1.6 MB animated volcano.gif, should be published and re-downloaded on every npm install. Would it be possible to move them to a demo website instead? Thanks!

Handle null values in grid

I have grids with null values, but the contour generator does not accept null or undefined in the array of values. Any solutions? I can set the nulls to gridMin - 1 and then set the thresholds accordingly, but I still get steps down from the larger-than-min values to gridMin.

Don’t round the corners

There are some rounded-corner artifacts on the edge of the grid, and they are much more visible with a coarser grid (screenshots from 2023-04-19 omitted).

I think it's because we assume that outside the grid the value is effectively -Infinity (or NaN). It would be nicer if we simply repeated the edge value infinitely (like WebGL's CLAMP_TO_EDGE) so that we got square corners.

/cc @Fil
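A minimal sketch of CLAMP_TO_EDGE-style sampling, assuming a row-major array of n×m values (clampedValue is a hypothetical helper, not d3-contour API):

```javascript
// Out-of-bounds reads repeat the nearest edge value instead of
// yielding -Infinity or NaN.
function clampedValue(values, n, m, x, y) {
  const xi = Math.max(0, Math.min(n - 1, x)),
        yi = Math.max(0, Math.min(m - 1, y));
  return values[xi + yi * n];
}
```

With this rule, cells just outside the grid take the same value as the edge cells, so contours that touch the boundary should meet it squarely rather than beveling off.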

Test cases are slightly misleading?

In the documentation and in the block examples, the size property of the contour generator created by d3.contours() is set to [n, m] where n × m equals the length of the input values array. In the test file, however, the input array contains 110 values while the size used is [10, 10].

Is this intentional, or am I missing something? Even if it doesn't change the result, maybe it should be fixed for clarity, either by removing the last line of the input array or by using [10, 11] as the size.

CLI?

It’d be nice to have a CLI, say taking GeoJSON points as input and producing GeoJSON polygons as output.

[feature]: cellSize(0.5) should work in d3.contourDensity()

Hello,
I would like cellSize(0.5) to work in d3.contourDensity(), because I'm not satisfied with the angular contours that cellSize(1) produces in the following example:

const contourDensity = d3.contourDensity()
    .x(d => d.x)
    .y(d => d.y)
    .size([width, height])
    .cellSize(1)
    .bandwidth(1)
    .thresholds(10);

Contour polygon insideness is reversed.

Thanks for creating this great library! It saves me a lot of time!

I have applied it to create contours for precipitation data, which is regional rather than global, and projected the contours with an orthographic projection, but I found that the polygon insideness at some levels is reversed: in the attached figure (omitted), the left panel is wrong. Any idea why d3 draws this path with its insideness reversed? When not projected, there is no such problem.

...
var contours = d3Contour.contours()
    .size([record.width, record.height])
    .smooth(false)
    .thresholds(record.colormap.domain)(record.rawData)
    // Change planar coordinates to spherical coordinates.
    .map(convertCoordinates)

svg.selectAll('path')
   .data(contours)
   .enter().append('path')
   .exit().remove()
svg.selectAll('path')
   .data(contours)
   .attr('d', path)
   .attr('transform', view.transform)
   .attr('fill', (d, i) => record.colormap.range[i])
   .attr('fill-rule', 'evenodd')
   .attr('value', d => d.value)

function convertCoordinates (d) {
  var e = Object.assign({}, d)
  e.coordinates = d.coordinates.map(polygon => {
    return polygon.map(ring => {
      return ring.map(point => {
        return [
          point[0] * record.grid.dlon + record.grid.startLon,
          point[1] * record.grid.dlat + record.grid.startLat
        ]
      })
    })
  })
  return e
}

Dependencies

Hi,

I am trying to use your library (with THREE.js).

I tried to run the example javascript code inside an HTML file with the line:

<script src="https://d3js.org/d3-contour.v1.min.js"></script>

At the line:
var contours = d3.contours()
    .size([n, m])
    .thresholds(d3.range(2, 21).map(p => Math.pow(2, p)))
    (values);

I got the error message: "TypeError: d3.range is not a function".

I guess I need to link another library?
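Indeed, d3.range is defined by d3-array, not d3-contour, so loading a d3-array build (or the full d3 bundle) before d3-contour should resolve the error. A sketch, assuming the d3js.org-hosted builds:

```html
<!-- d3.range comes from d3-array; load it (or the full d3 bundle)
     before d3-contour so both attach to the global `d3` object. -->
<script src="https://d3js.org/d3-array.v1.min.js"></script>
<script src="https://d3js.org/d3-contour.v1.min.js"></script>
```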

Output data has values that do not exist in the input data

The input array, which is 10,000 values long, contains only the following numbers:

[-32768, 0, 1, 13, 6, 7, 8, 9]

and the thresholds are [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14], yet the output includes polygons with values 10, 11, and 12. The polygons with values 10, 11, and 12 have the same length as the polygons with value 9. Is this a bug? How can I get output without 10, 11, and 12?

d3.contours

  1. Specify the extent (or size, [width, height]).
  2. Specify a value function that returns a numeric value for a given location [x, y].
  3. Allow an optional domain to be specified (see histogram.domain).
  4. Compute the function for each integer position [x, y] in the extent.
  5. Compute thresholds using the same logic as histogram.threshold.
  6. For each unassigned position remaining in the computed array, compute the contour threshold for the range of values corresponding to this position’s value. Repeat while any positions remain in the array.
  7. Optionally smooth the returned contours.

hole issue in contour

I ran into an issue where there are holes when drawing contours. When I change the size, they sometimes appear and sometimes disappear (screenshot omitted).

Please advise if anyone can help me out.
Thank you.

Tile border

Hey!
It's more a question than an issue: I am using this excellent library to produce contour lines over elevation-data PNG tiles. However, I'd like to produce my contour lines tile-based, so I work on one tile at a time. The library produces polygons that I transform into MultiLineStrings afterwards. But the problem remains that the lines run along the borders of the tiles (which is to be expected). I am having a hard time finding a way to reduce the output to lines going from border to border without framing the tile. Any idea how to go about it? Cheers!

Convenience function for computing grid from a continuous function?

Like this:

var x = d3.scaleLinear().domain([-2, 2]).range([0, width]);
var y = d3.scaleLinear().domain([-3, 1]).range([height, 0]);
var values = new Array(width * height);
for (var j = 0.5, k = 0; j < height; ++j) {
  for (var i = 0.5; i < width; ++i, ++k) {
    values[k] = value(x.invert(i), y.invert(j));
  }
}

Or maybe simpler, externalizing the scales:

var values = new Array(width * height);
for (var j = 0.5, k = 0; j < height; ++j) {
  for (var i = 0.5; i < width; ++i, ++k) {
    values[k] = value(i, j);
  }
}

The tricky part is to remember to evaluate the function at the midpoints of the pixels rather than their top-left corner.

Option to remove points along straight lines?

We currently emit one point per pixel in the grid, which leads to many extra coordinates along straight edges. For example this:

{
  "type": "MultiPolygon",
  "value": 0.5,
  "coordinates": [
    [
      [[6, 7.5], [6, 6.5], [6, 5.5], [6, 4.5], [6, 3.5], [5.5, 3], [4.5, 3],
       [3.5, 3], [3, 3.5], [3, 4.5], [3, 5.5], [3, 6.5], [3, 7.5], [3.5, 8],
       [4.5, 8], [5.5, 8], [6, 7.5]]
    ]
  ]
}

Could be reduced to this:

{
  "type": "MultiPolygon",
  "value": 0.5,
  "coordinates": [
    [
      [[6, 7.5], [6, 3.5], [5.5, 3], [3.5, 3], [3, 3.5], [3, 7.5], [3.5, 8], [5.5, 8], [6, 7.5]]
    ]
  ]
}
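The thinning could keep only the corners, dropping any point that is collinear with its neighbors. A sketch (simplifyRing is a hypothetical helper, not part of d3-contour), using the cross product of adjacent segment directions on a closed ring whose first point equals its last:

```javascript
// Remove interior points where the polyline does not turn.
function simplifyRing(ring) {
  const pts = ring.slice(0, -1), n = pts.length, kept = [];
  for (let i = 0; i < n; ++i) {
    const [px, py] = pts[(i - 1 + n) % n],
          [cx, cy] = pts[i],
          [nx, ny] = pts[(i + 1) % n];
    // Keep the point only if the turn at it is nonzero (a corner).
    if ((cx - px) * (ny - cy) - (cy - py) * (nx - cx) !== 0) kept.push(pts[i]);
  }
  kept.push(kept[0]); // re-close the ring
  return kept;
}
```

Applied to the ring in the first MultiPolygon above, this yields exactly the reduced ring shown in the second.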

contour problems at antimeridian line

Starting from GeoTIFF Contours II (https://bl.ocks.org/mbostock/83c0be21dba7602ee14982b020b12f51), I have tested reading directly from netCDF files (the standard format of the climate models from the IPCC and the Earth System Modeling community). See https://github.com/PBrockmann/D3_netcdfjs and try netcdfjs_05.html.

It sounds very promising, so a great big thank-you for this d3-contour module! In my dreams, I would also like to get raster-pixel rendering as in http://geoexamples.com/d3-raster-tools-docs/code_samples/raster-pixels-page.html, but that is another story. Is there an official d3 module to do that, by the way?

The problem with d3-contour in a geo context is that some contours at the antimeridian are shifted either up or down. The map generated by netcdfjs_05.html is zoomable, so the problem can clearly be seen; I haven't been able to find where this shift comes from.

Any help is welcome on this topic, which could help a huge community.

Breaking change with contour-density thresholds starting from v3.0.2

Hi guys,
We are updating our dependency on d3-contour from 3.0.1 to 4.0.2, but it looks like it's not backward compatible starting from 3.0.2. The breaking change was introduced in PR #57 and published as a patch version (3.0.2).

In our application, we provide thresholds as a callback function that returns an array:

contourDensity().thresholds((densityValues) => someArrayOfThresholds)

Could you suggest how to make this compatible with the latest version (4.0.2) of d3-contour? Maybe we need to use a built-in function such as thresholdFreedmanDiaconis, thresholdScott, or thresholdSturges to make the density chart draw correctly again?

Epanechnikov kernel estimator

Is there any way to change the kernel function of contourDensity?

I would like to use the Epanechnikov kernel rather than the Gaussian kernel described in the documentation. Any way to achieve this?

Thanks.
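For context, the kernel itself is simple. Below is a one-dimensional sketch with hypothetical names, not d3-contour's API; d3-contour's density estimator approximates a Gaussian via repeated box blurs, so swapping kernels would likely mean computing the value grid yourself and passing it to d3.contours:

```javascript
// Epanechnikov kernel: K(u) = 3/4 (1 - u²) for |u| ≤ 1, else 0.
function epanechnikov(u) {
  return Math.abs(u) <= 1 ? 0.75 * (1 - u * u) : 0;
}

// Density estimate at x: mean of kernel evaluations, scaled by bandwidth.
function kernelDensityEstimator(kernel, bandwidth, data) {
  return x => data.reduce((sum, xi) => sum + kernel((x - xi) / bandwidth), 0)
              / (bandwidth * data.length);
}
```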

Support curvilinear grids

I have an XY grid, with each cell address mapped to a specific GPS coordinate. After generating contours I see I get polygons with coordinates based on cell positions.

I guess for earth data where pixels are regular and bound by some projection system, I would be able to re-project back to map coordinates, but I can't easily do this with my irregular grid - I think I would need to interpolate every coordinate of the resulting polygon based on closest known real coordinates.

Would it be possible to update the API to allow for specifying coordinates along with a values array that the library then uses to calculate the resulting polygons? (I'm not sure if this makes sense as I don't know how the marching squares algorithm works)

If so, I would be happy to try to contribute to this, although I would probably need some guidance!

Improve slow computations for large images

I'm using this library to discretely compute polygons of swept areas for trucks and semi-trailers in the frontend of a web application.

It works really great, but computing the contour polygons of large images (> 1 megapixel) takes a long time. In my specific case, at around 3 MP I need ~10 seconds, and at 7 MP I have to wait around 30 seconds to compute the contour of a single-threshold area.

Profiling the code showed that the Array.concat at

fragmentByStart[f.start] = fragmentByEnd[g.end] = {start: f.start, end: g.end, ring: f.ring.concat(g.ring)};

and at https://github.com/d3/d3-contour/blob/master/src/contours.js#L149 causes lots of GC time and slows things down by a large margin. This concat constantly copies a large array just to prepend/append a small number of points.

Replacing it with a simple g.ring.push(...f.ring) improves performance 2 to 3 times. Using that push together with fancier tree building, flattening only when the result is returned in the callback, reduces the strain on the GC even more and speeds up my 7 MP computation almost tenfold.

Though I'm not 100% sure whether that change is correct or whether the concat was required for some edge case. Anyway, here is the change; are you interested in a PR?

hoeck@5e50033

Option to disable nice thresholds

I would like to be able to generate a specific number of contours without using nice thresholds. This would be useful to avoid problems like having no contour lines when the number of thresholds is set too low. See vega/vega-geo#4

GeoJSON geometry generated by d3.contours displays incorrectly

I generate GeoJSON data using d3.contours; the data contains MultiPolygon features. When I render it with a fill layer, the polygons merge at small zoom levels; when I zoom in, they look normal (demo omitted).

I found that the GeoJSON data has some consecutive identical coordinates, but I am not sure whether that causes the error.

Density contour being clipped?

I'm running into a peculiar problem when using contourDensity() to draw a rather basic density estimation around a set of points: my contour lines appear to be clipped (screenshots omitted).

I closely followed this example, and my code does not differ by much. Here's the relevant code:

g.insert("g", "g")
    .attr("fill", "none")
    .attr("stroke", 'black')
    .attr("stroke-linejoin", "round")
    .selectAll("path")
    .data(contour.contourDensity()
        .x(function (d) { return d.fdx; })
        .y(function (d) { return d.fdy; })
        .bandwidth(80)
        .thresholds(3)
        (selectedNodesData))
    .enter().append("path")
    .attr("d", d3.geoPath())
    .attr("fill", function (d) { return '#eeeeee' })

Changing the .bandwidth() parameter makes the contours larger (extending beyond the clip above), but the new contours are still clipped at other positions.

Any suggestions welcomed, thank you!

Release d3-array v2 compatible version?

It would be wonderful to have a version of d3-contour that works with d3-array v2 to avoid multiple version issues in downstream projects. It looks like the package.json file would need to be updated to support multiple major versions. Thanks!

Is it possible to add a GeoTransform?

I made a similar (worse) library some months ago: https://github.com/rveciana/raster-marching-squares
I can't find a way to give d3-contour the geographical positions of the pixels so that the output isolines can be projected.
In GDAL this is called a GeoTransform, which is simply the coordinates of the first pixel and the delta for each step in the x and y directions.
If a GeoTransform could be set, the output coordinates would be in lon/lat and a projected geo-path could be used.
An example using my old library, reading the data from a GeoTIFF: http://bl.ocks.org/rveciana/bef48021e38a77a520109d2088bff9eb

Computing contours from discrete points.

Like this (image omitted):

A quick guess on how you could do it: create a coarse grid, bin the points into the grid, blur the grid, and then compute the contours. The grid size and the blur radius should presumably be configurable.
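The binning step of that guess could be sketched like this (binPoints is a hypothetical helper, not d3-contour API; the cell size and blur radius would be the configurable knobs):

```javascript
// Count points into a coarse grid, one cell per cellSize×cellSize square,
// stored row-major. Blurring and contouring would follow.
function binPoints(points, width, height, cellSize) {
  const n = Math.ceil(width / cellSize),
        m = Math.ceil(height / cellSize),
        grid = new Float64Array(n * m);
  for (const [px, py] of points) {
    const xi = Math.floor(px / cellSize),
          yi = Math.floor(py / cellSize);
    if (xi >= 0 && xi < n && yi >= 0 && yi < m) grid[xi + yi * n] += 1;
  }
  return {grid, n, m};
}
```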

smoothing removes line segments

When the smoothing flag is true, contours removes the open line segments and leaves only closed polygons. Is it possible to choose the smoothing algorithm?

Convenience function for computing a single contour?

A convenience function for computing a single contour. Like this, but without clobbering contours.thresholds:

contours.contour = function(values, threshold) {
  return contours.thresholds([threshold])(values)[0];
};

Weighted points in density

It would be great to add an option for entering weighted points in the density function. I think it would just be a matter of having a weight accessor that supplies the weight for each data point, and adjusting line 35 accordingly:

    values0[xi + yi * n] += weight(d, i, data);

The default weight could be 1.
