I have two 16-bit integers of raw data.
For example:
High word = 17142 (dec) or 0100001011110110 (binary)
Low word = 59759 (dec) or 1110100101111001 (binary)
If you treat the two words together as one 32-bit float, it will be 123.456.
Binary --> 01000010111101101110100101111001
How do I convert the integer array [59759, 17142] to the float 123.456 in JavaScript?
Note: [ X (16-bit low word), Y (16-bit high word) ] ==> Z (32-bit float)
Asked Dec 5, 2016 at 9:15 by Heng-Shou Liu; edited Dec 5, 2016 at 9:27 by Kirk Beard. 2 Answers
You can do this with typed arrays and an ArrayBuffer, which allow you to interpret the same bits in different ways (but the byte order is platform-specific). It's also possible using a DataView on the buffer, which lets you control endianness.
Here's the typed-array approach, which works with the endianness of my platform; see the comments:
// Create a buffer
var buf = new ArrayBuffer(4);
// Create a 16-bit int view of it
var ints = new Uint16Array(buf);
// Fill in the values
ints[0] = 59759;
ints[1] = 17142;
// Create a 32-bit float view of it
var floats = new Float32Array(buf);
// Read the bits as a float; note that by doing this, we're implicitly
// converting it from a 32-bit float into JavaScript's native 64-bit double
var num = floats[0];
// Done
console.log(num);
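Since the typed-array trick above depends on the platform's byte order, it can help to check it at runtime. This is a small sketch (the function name is my own) using the same ArrayBuffer technique:

```javascript
// Detect platform endianness: store a known 16-bit value and
// inspect which byte ends up first in memory.
function isLittleEndian() {
  const buf = new ArrayBuffer(2);
  new Uint16Array(buf)[0] = 0x0102;
  // On a little-endian platform the low byte (0x02) is stored first
  return new Uint8Array(buf)[0] === 0x02;
}

console.log(isLittleEndian()); // true on most common platforms (x86, ARM default)
```

If this returns false, the `ints[0] = 59759; ints[1] = 17142;` ordering above would need to be swapped.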
Here's the DataView
approach, note writing the ints in the opposite order:
// Create a buffer
var buf = new ArrayBuffer(4);
// Create a data view of it
var view = new DataView(buf);
// Write the ints to it
view.setUint16(0, 17142);
view.setUint16(2, 59759);
// Read the bits as a float; note that by doing this, we're implicitly
// converting it from a 32-bit float into JavaScript's native 64-bit double
var num = view.getFloat32(0);
// Done
console.log(num);
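The DataView steps above can be wrapped into a small reusable helper; this is a sketch (the function name `wordsToFloat32` is my own) that works regardless of platform endianness because DataView defaults to big-endian access:

```javascript
// Combine a 16-bit low word and high word into a 32-bit float.
function wordsToFloat32(lowWord, highWord) {
  const view = new DataView(new ArrayBuffer(4));
  // Write the high word first, then the low word (big-endian layout)
  view.setUint16(0, highWord);
  view.setUint16(2, lowWord);
  // Read the four bytes back as a 32-bit float (implicitly widened
  // to JavaScript's native 64-bit double)
  return view.getFloat32(0);
}

console.log(wordsToFloat32(59759, 17142)); // ≈ 123.456
```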
You can shorten @T.J. Crowder's answer and make it more efficient by using the array buffer directly:
var data = [59759, 17142];
// Create a buffer
var buf = new Uint16Array(data).buffer;
// Create a data view of it
var view = new DataView(buf);
var num = view.getFloat32(0, true);
// Done
console.log(num);