# json-bigint-arkts

**Repository Path**: ArkTSCentralRepository/json-bigint-arkts

## Basic Information

- **Project Name**: json-bigint-arkts
- **Description**: json-bigint-arkts
- **Primary Language**: Unknown
- **License**: MIT
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2025-07-07
- **Last Updated**: 2025-07-08

## Categories & Tags

**Categories**: Uncategorized

**Tags**: None

## README

# json-bigint

Adapted from version 1.0.0 of the original [json-bigint](https://www.npmjs.com/package/json-bigint?activeTab=readme) library; all functional code has been converted to `ArkTS` files.

## Install

```sh
ohpm install json-bigint
```

## Description

JSON.parse/stringify with bigints support. Based on Douglas Crockford's [JSON.js](https://github.com/douglascrockford/JSON-js) package and the [bignumber.js](https://github.com/MikeMcl/bignumber.js) library.

Native `BigInt` was added to JS recently, so we added an option to leverage it instead of `bignumber.js`. However, parsing with native `BigInt` is kept optional for backward compatibility.

While most JSON parsers assume numeric values have the same precision restrictions as IEEE 754 doubles, the JSON specification _does not_ say anything about number precision. Any floating point number in decimal (optionally scientific) notation is a valid JSON value.
It's a good idea to serialize values which might fall outside IEEE 754 integer precision as strings in your JSON API, but `{ "value" : 9223372036854775807 }`, for example, is still a valid RFC 4627 JSON string, and in most JS runtimes the result of `JSON.parse` is this object: `{ value: 9223372036854776000 }`.

example:

```typescript
import JsonBigint from 'json-bigint';

const json = '{ "value" : 9223372036854775807, "v2": 123 }';
console.log('Input:', json);
console.log('');

console.log('node.js built-in JSON:');
let r: ESObject = JSON.parse(json);
console.log('JSON.parse(input).value : ', r.value.toString());
console.log('JSON.stringify(JSON.parse(input)):', JSON.stringify(r));

const jsonBigint = new JsonBigint();
console.log('\n\nbig number JSON:');
let r1: ESObject = jsonBigint.parse(json);
console.log('JSONbig.parse(input).value : ', r1.value.toString());
console.log('JSONbig.stringify(JSONbig.parse(input)):', jsonBigint.stringify(r1));

expect(r1.value.toString()).assertEqual('9223372036854775807');
expect(jsonBigint.stringify(r1)).assertEqual(`{"value":9223372036854775807,"v2":123}`);
```

Output:

```
Input: { "value" : 9223372036854775807, "v2": 123 }

node.js built-in JSON:
JSON.parse(input).value :  9223372036854776000
JSON.stringify(JSON.parse(input)): {"value":9223372036854776000,"v2":123}


big number JSON:
JSONbig.parse(input).value :  9223372036854775807
JSONbig.stringify(JSONbig.parse(input)): {"value":9223372036854775807,"v2":123}
```

### Options

The behaviour of the parser is somewhat configurable through 'options'.

#### options.strict, boolean, default false

Specifies that parsing should be "strict" about reporting duplicate keys in the parsed string. The default follows what is allowed in standard JSON and resembles the behavior of JSON.parse, but overwrites any previous value with the last one assigned to the duplicate key.
Setting options.strict = true will fail fast on such duplicate-key occurrences and thus warn you upfront of possibly lost information.

example:

```typescript
import JsonBigint from 'json-bigint';

const JSONbig = new JsonBigint();
const JSONstrict = new JsonBigint({ strict: true });

const dupkeys = '{ "dupkey": "value 1", "dupkey": "value 2"}';
console.log('\n\nDuplicate Key test with both lenient and strict JSON parsing');
console.log('Input:', dupkeys);

const works: ESObject = JSONbig.parse(dupkeys);
console.log('JSON.parse(dupkeys).dupkey: %s', works.dupkey);

let fails = 'will stay like this';
try {
  fails = JSONstrict.parse(dupkeys);
  console.log('ERROR!! Should never get here');
} catch (e) {
  console.log(
    'Successfully caught expected exception on duplicate keys: %j',
    e
  );
}
expect(fails).assertEqual('will stay like this');
```

Output:

```
Duplicate Key test with both lenient and strict JSON parsing
Input: { "dupkey": "value 1", "dupkey": "value 2"}
JSON.parse(dupkeys).dupkey: value 2
Successfully caught expected exception on duplicate keys: {"name":"SyntaxError","message":"Duplicate key \"dupkey\"","at":33,"text":"{ \"dupkey\": \"value 1\", \"dupkey\": \"value 2\"}"}
```

#### options.storeAsString, boolean, default false

Specifies if BigInts should be stored in the object as strings rather than the default BigNumber. Note that this is a dangerous behavior, as it breaks the default ability to convert back and forth without data type changes (this converts all BigInts to be-and-stay strings).
example:

```typescript
import JsonBigint from 'json-bigint';

const JSONbig = new JsonBigint();
const JSONbigString = new JsonBigint({ storeAsString: true });

const key = '{ "key": 1234567890123456789 }';
console.log('\n\nStoring the BigInt as a string, instead of a BigNumber');
console.log('Input:', key);

const withInt: ESObject = JSONbig.parse(key);
const withString: ESObject = JSONbigString.parse(key);
console.log(
  'Default type: %s, With option type: %s',
  typeof withInt.key,
  typeof withString.key
);
expect(typeof withInt.key).assertEqual('object');
expect(typeof withString.key).assertEqual('string');
```

Output:

```
Storing the BigInt as a string, instead of a BigNumber
Input: { "key": 1234567890123456789 }
Default type: object, With option type: string
```

#### options.useNativeBigInt, boolean, default false

Specifies if the parser uses native BigInt instead of bignumber.js.

example:

```typescript
import JsonBigint from 'json-bigint';

const JSONbig = new JsonBigint();
const JSONbigNative = new JsonBigint({ useNativeBigInt: true });

const key = '{ "key": 993143214321423154315154321 }';
console.log(`\n\nStoring the Number as native BigInt, instead of a BigNumber`);
console.log('Input:', key);

const normal: ESObject = JSONbig.parse(key);
const nativeBigInt: ESObject = JSONbigNative.parse(key);
expect(typeof normal.key).assertEqual('object');
expect(typeof nativeBigInt.key).assertEqual('bigint');
console.log(
  'Default type: %s, With option type: %s',
  typeof normal.key,
  typeof nativeBigInt.key
);
```

Output:

```
Storing the Number as native BigInt, instead of a BigNumber
Input: { "key": 993143214321423154315154321 }
Default type: object, With option type: bigint
```

#### options.alwaysParseAsBig, boolean, default false

Specifies if all numbers should be stored as BigNumber.
Note that this is a dangerous behavior, as it breaks the default ability to convert back and forth without data type changes (this converts all Numbers to be-and-stay BigNumbers).

example:

```typescript
import JsonBigint from 'json-bigint';

const JSONbig = new JsonBigint();
const JSONbigAlways = new JsonBigint({ alwaysParseAsBig: true });

const key = '{ "key": 123 }'; // there is no need for BigNumber by default, but we're forcing it
console.log(`\n\nStoring the Number as a BigNumber, instead of a Number`);
console.log('Input:', key);

const normal: ESObject = JSONbig.parse(key);
const always: ESObject = JSONbigAlways.parse(key);
expect(typeof normal.key).assertEqual('number');
expect(typeof always.key).assertEqual('object');
console.log(
  'Default type: %s, With option type: %s',
  typeof normal.key,
  typeof always.key
);
```

Output:

```
Storing the Number as a BigNumber, instead of a Number
Input: { "key": 123 }
Default type: number, With option type: object
```

If you want to force all numbers to be parsed as native `BigInt` (you probably do! Otherwise any calculations become a real headache):

```typescript
import JsonBigint from 'json-bigint';

const JSONbig = new JsonBigint({
  alwaysParseAsBig: true,
  useNativeBigInt: true,
});
```

#### options.protoAction, string, default: "error". Possible values: "error", "ignore", "preserve"

#### options.constructorAction, string, default: "error". Possible values: "error", "ignore", "preserve"

Controls how the `__proto__` and `constructor` properties are treated. If set to "error", they are not allowed and the parse() call will throw an error. If set to "ignore", the property and its value are skipped during parsing and object building. If set to "preserve", the `__proto__` property is set. One should be extra careful and make sure any other library consuming the generated data is not vulnerable to prototype poisoning attacks.
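To see why a parser needs these guards at all, here is a small self-contained sketch of how a `__proto__` key can cause trouble. It uses plain standard `JSON.parse` (not json-bigint) and illustrative variable names, so treat it as background rather than library behavior:

```typescript
// A payload carrying a "__proto__" key. Standard JSON.parse does not
// treat the key specially: it becomes an ordinary own property.
const malicious = '{ "__proto__": { "polluted": true } }';
const parsed = JSON.parse(malicious);
console.log(Object.keys(parsed)); // [ '__proto__' ]

// A naive copy assigns via [[Set]], which triggers the inherited
// __proto__ setter and silently rewires the target's prototype chain:
const target: Record<string, unknown> = {};
Object.assign(target, parsed);
console.log(target['polluted']);   // true — inherited from the injected prototype
console.log(Object.keys(target));  // [] — nothing was copied as an own key
```

With `protoAction: "error"` (the default), parse() rejects such input outright; with `"ignore"`, the key is dropped during object building, so a later copy like the one above stays harmless.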