[ept] Fix point cloud loading hang/crash (fixes #42696)
The fixed buffer of 34 bytes may not be large enough: input LAZ files can carry
extra fields. For example, the failing dataset I got hold of had a point record
length of 80 bytes due to a couple of extra fields. Checked with the laz-perf
devs, and this should be the right approach to get an appropriately sized
buffer for reading points.
wonder-sk authored and nyalldawson committed Jun 14, 2021
1 parent 94e9b33 commit e918d6e6693044e0270514139f387f20757d61b0
Showing with 3 additions and 1 deletion.
  1. +3 −1 src/core/pointcloud/qgseptdecoder.cpp
@@ -287,7 +287,9 @@ QgsPointCloudBlock *__decompressLaz( FileType &file, const QgsPointCloudAttribut
   const size_t count = f.get_header().point_count;
   QgsVector3D scale( f.get_header().scale.x, f.get_header().scale.y, f.get_header().scale.z );
   QgsVector3D offset( f.get_header().offset.x, f.get_header().offset.y, f.get_header().offset.z );
-  char buf[sizeof( laszip::formats::las::point10 ) + sizeof( laszip::formats::las::gpstime ) + sizeof( laszip::formats::las::rgb ) ]; // a buffer large enough to hold our point
+
+  QByteArray bufArray( f.get_header().point_record_length, 0 );
+  char *buf = bufArray.data();
 
   const size_t requestedPointRecordSize = requestedAttributes.pointRecordSize();
   QByteArray data;
