Heatmap error checking and documentation for data larger than Bitmap supports #772

@Matthias-C

Description

Found this when rendering big datasets into a heatmap.
From my point of view it looks like the internal plotting runs into a variable type limit (I guess a signed short, 2^15) and then stretches the last row of data all the way to the end, as seen in the pictures.
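The signed-short hypothesis can be illustrated with a minimal sketch (this is plain C#, not ScottPlot internals, and just demonstrates what a 16-bit wraparound of the row counts from the test cases below would look like):

```csharp
using System;

class ShortOverflowDemo
{
    static void Main()
    {
        // Row counts matching the test cases in the repro
        int[] rowCounts = { 3768, 32768, 34000, 60000 };
        foreach (int rows in rowCounts)
        {
            // If an intermediate row/pixel index were stored in a signed
            // 16-bit variable, anything above short.MaxValue (32767) wraps.
            short wrapped = unchecked((short)rows);
            Console.WriteLine($"{rows} -> {wrapped} (wraps: {wrapped != rows})");
        }
    }
}
```

Under this hypothesis, 34000 would wrap to -31536 and 60000 to -5536, which could explain why only the row counts above 2^15 misrender.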

OK: renders with less than 2^15 rows in the Y dimension

[Screenshots: outOK1, outOK2 — correctly rendered heatmaps]

Failed renders at more than 2^15 rows

[Screenshots: outFail1, outFail2 — last row of data stretched to the end of the image]

    [Test]
    [TestCase(3768, "outOK1.png")]
    [TestCase(32768, "outOK2.png")]
    [TestCase(34000, "outFail1.png")]
    [TestCase(60000, "outFail2.png")]
    public void ScottplotBoundary(int traceCount, string path)
    {
        var r = new Random();
        int sampleCount = 500;

        // Fill a traceCount x sampleCount grid with random values
        var zData = new double[traceCount, sampleCount];
        for (var i = 0; i < traceCount; i++)
            for (var j = 0; j < sampleCount; j++)
                zData[i, j] = r.NextDouble();

        var plt = new Plot();
        CoordinatedHeatmap hmc = plt.AddHeatMapCoordinated(zData);
        plt.AxisAuto(0, 0);
        plt.SaveFig(path);
    }
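Until bounds checking lands, one possible workaround is to downsample the data so the row count stays below 2^15 before handing it to the plot. `DownsampleRows` below is a hypothetical helper, not part of the ScottPlot API, and block-averaging is just one reduction strategy:

```csharp
using System;

static class HeatmapDownsample
{
    // Hypothetical helper: average fixed-size blocks of rows so the
    // result has at most maxRows rows (e.g. short.MaxValue = 32767).
    public static double[,] DownsampleRows(double[,] data, int maxRows)
    {
        int rows = data.GetLength(0);
        int cols = data.GetLength(1);
        if (rows <= maxRows)
            return data;

        // Number of source rows folded into each output row (ceiling division)
        int block = (rows + maxRows - 1) / maxRows;
        int outRows = (rows + block - 1) / block;

        var result = new double[outRows, cols];
        for (int o = 0; o < outRows; o++)
        {
            int start = o * block;
            int end = Math.Min(start + block, rows);
            for (int c = 0; c < cols; c++)
            {
                double sum = 0;
                for (int r = start; r < end; r++)
                    sum += data[r, c];
                result[o, c] = sum / (end - start);
            }
        }
        return result;
    }
}
```

With 60000 input rows and maxRows = 32767 this folds pairs of rows together, producing 30000 output rows, which stays inside the limit the repro above runs into.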

Found on NuGet 4.1.6-beta, running on .NET Core 3.1.

Metadata

Labels: BUG (unexpected behavior)