User talk:Cqwrteur
About warning on <charconv>
I won't reject such a warning if you want to add it to cppreference. However, I suggest that these contents should be added to the right places and slightly reworded, as listed below:
- The status of implementations of <charconv> is already listed in cpp/compiler support (and maintained mainly by me), so there is no need to introduce it.
- The intent of P0067R5 is not to introduce a new, fast conversion algorithm. Instead, it may just be attempting to get some performance improvement by ignoring locale.
- And the rest should be added to cpp/utility#Elementary string conversions and cpp/io, not the page for the header.
And I wonder why you wrote "the result is still not deterministic on EBCDIC machines".
--Fruderica (talk) 08:09, 6 January 2020 (PST)
The performance of charconv is still terrible, since you still need to output to a stream in the end, which basically eliminates all the performance gain from charconv. GCC does not support this, and neither does Clang.
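To make the first point concrete, here is a minimal sketch of typical usage (not taken from any benchmark; the buffer size and value are arbitrary): the conversion itself avoids locale and streams, but the bytes still have to be pushed through a stream at the end.
 #include <charconv>
 #include <cstdio>
 #include <system_error>
 
 int main()
 {
     // to_chars writes into a caller-provided buffer...
     char buf[32];
     auto [ptr, ec] = std::to_chars(buf, buf + sizeof buf, 123456789);
     // ...but the result still goes through a stream (here stdio) in the end.
     if (ec == std::errc{})
         std::fwrite(buf, 1, static_cast<std::size_t>(ptr - buf), stdout);
 }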
charconv demands more than stdio.h and iostream ever required, and many implementations do not have such algorithms at all: the result of charconv is required to be deterministic, with no precision lost, which is NOT possible. No algorithm in the world can achieve that.
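For reference, a minimal round-trip sketch of what that requirement means (value and buffer size chosen arbitrarily; it assumes an implementation that actually ships floating-point <charconv>):
 #include <cassert>
 #include <charconv>
 #include <system_error>
 
 int main()
 {
     // [charconv.to.chars]/2: the shortest output of to_chars, parsed back
     // with from_chars, must recover exactly the original value.
     double original = 0.1;
     char buf[64];
     auto [end, ec] = std::to_chars(buf, buf + sizeof buf, original);
     assert(ec == std::errc{});
 
     double recovered = 0.0;
     auto [unused, ec2] = std::from_chars(buf, end, recovered);
     assert(ec2 == std::errc{});
     assert(recovered == original); // required to hold for every finite value
 }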
It is an N+1 problem. stdio is problematic, and so is iostream. fmt and charconv are all terrible ideas too.
C++ should just deprecate stdio.h and iostream.
Here are my benchmarks to show why charconv does not work. https://bitbucket.org/ejsvifq_mabmip/fast_io/src/master/
Also, assuming ASCII only does not work, since implementations use char const[] literals. On EBCDIC machines the results are nowhere near predictable, since the compiler will screw up string literals on those machines.
These are all N+1 problems. stdio.h, iostream, asio, process, charconv, fmt, codecvt. What more do we need to have???
We should just add a new io library and remove ALL of this shit.
- I don't believe there's any relation between <charconv> utilities and how the compiler deals with string literals, see also the current wording.
- And I don't know whether there is, theoretically, any value that can't be converted and recovered according to [charconv.to.chars]/2. An LWG issue should be submitted if such a value can be given. --Fruderica (talk) 17:04, 6 January 2020 (PST)
Just as an example (notice that libstdc++ does not support floating-point charconv): https://github.com/gcc-mirror/gcc/blob/master/libstdc%2B%2B-v3/include/bits/charconv.h
 static constexpr char __digits[201] =
   "0001020304050607080910111213141516171819"
   "2021222324252627282930313233343536373839"
   "4041424344454647484950515253545556575859"
   "6061626364656667686970717273747576777879"
   "8081828384858687888990919293949596979899";
The compiler can interpret string literals however it likes. With an EBCDIC compiler, the encoding of that table is EBCDIC, and so is the output of charconv.
In my fast_io library, all these kinds of things are either avoided or done with u8 string literals.
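A minimal sketch of the difference (my own illustration, not code from libstdc++ or fast_io; the u8/char8_t part assumes C++20):
 #include <cstdio>
 
 // An ordinary narrow string literal is encoded in the compiler's execution
 // character set, so on an EBCDIC compiler '0' is 0xF0. A u8 literal is
 // always UTF-8, so the digits are always 0x30..0x39.
 static constexpr char    narrow_digits[] =   "0123456789";
 static constexpr char8_t utf8_digits[]   = u8"0123456789";
 
 int main()
 {
     std::printf("%#x %#x\n",
                 static_cast<unsigned>(static_cast<unsigned char>(narrow_digits[0])),
                 static_cast<unsigned>(utf8_digits[0]));
 }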
I mean, all these band-aid fixes on top of std::FILE* or iostream are stupid. The entire stream/stdio.h situation is even worse than exceptions and RTTI. We desperately need a new IO library which eliminates all the use cases of std::FILE* and iostream.