phonon on Jan 16, 2023 | on: Supporting half-precision floats is really annoyin...
And on the Nvidia H100, FP32 runs at 67 teraFLOPS... and BFLOAT16/FP16 runs at 1,979 teraFLOPS.

https://www.nvidia.com/en-us/data-center/h100/
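As a back-of-the-envelope check, the two peak figures quoted above imply roughly a 30x throughput gap (note that NVIDIA's spec sheet quotes Tensor Core numbers, which typically include a sparsity footnote, so real dense-workload ratios may differ). A minimal sketch of the arithmetic:

```python
# Peak throughput figures quoted from NVIDIA's H100 spec page (teraFLOPS).
fp32_tflops = 67       # FP32
fp16_tflops = 1_979    # BFLOAT16/FP16 Tensor Core, as listed on the spec page

# Ratio of peak half-precision to peak single-precision throughput.
ratio = fp16_tflops / fp32_tflops
print(f"FP16 : FP32 peak throughput ratio ~ {ratio:.1f}x")
```

With these numbers the ratio comes out to roughly 29.5x, which is why skipping half-precision support leaves so much of the hardware's peak on the table.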