Thread overview ("Well DUH ! AI People Finally Realize They Can Ditch
Most Floating-Point for Big Ints"; replies carry the same subject unless
noted, authors shown):

* 186282@ud0s4.net
+* Richard Kettlewell
|+* The Natural Philosopher
||`* Pancho
|| `* The Natural Philosopher
||  +* Computer Nerd Kev
||  |`- The Natural Philosopher
||  `* 186282@ud0s4.net
||   `* The Natural Philosopher
||    `- 186282@ud0s4.net
|`* 186282@ud0s4.net
| +- Mike Scott
| `- Richard Kettlewell
`* Pancho
 `* 186282@ud0s4.net
  +* The Natural Philosopher
  |`* 186282@ud0s4.net
  | +- Pancho
  | `* Richard Kettlewell
  |  `* 186282@ud0s4.net
  |   `* Richard Kettlewell
  |    `* 186282ud0s3
  |     `* Chris Ahlstrom
  |      `- The Natural Philosopher
  `* rbowman
   `* Chris Ahlstrom
    `* rbowman
     `* 186282@ud0s4.net
      `* Chris Ahlstrom
       +- Charlie Gibbs
       +* vallor ("Less politics, more Linux!")
       |`* Chris Ahlstrom (Re: Less politics, more Linux!)
       | `- vallor
       +- The Natural Philosopher
       `* rbowman
        `* Chris Ahlstrom
         `- The Natural Philosopher

Subject: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: 186282@ud0s4.net
Newsgroups: comp.os.linux.misc
Organization: wokiesux
Date: Sun, 13 Oct 2024 02:54 UTC

https://techxplore.com/news/2024-10-integer-addition-algorithm-energy-ai.html

A team of engineers at AI inference technology company
BitEnergy AI reports a method to reduce the energy needs
of AI applications by 95%. The group has published a
paper describing their new technique on the arXiv preprint
server.

As AI applications have gone mainstream, their use has
risen dramatically, leading to a notable rise in energy
needs and costs. LLMs such as ChatGPT require a lot of
computing power, which in turn means a lot of electricity
is needed to run them.

As just one example, ChatGPT now requires roughly 564 MWh
daily, or enough to power 18,000 American homes. As the
science continues to advance and such apps become more
popular, critics have suggested that AI applications might
be using around 100 TWh annually in just a few years, on
par with Bitcoin mining operations.

In this new effort, the team at BitEnergy AI claims that
they have found a way to dramatically reduce the amount
of computing required to run AI apps that does not result
in reduced performance.

The new technique is basic—instead of using complex
floating-point multiplication (FPM), the method uses integer
addition. Apps use FPM to handle extremely large or small
numbers, allowing applications to carry out calculations
using them with extreme precision. It is also the most
energy-intensive part of AI number crunching.

. . .

The default use of floating-point really took off when
'neural networks' became popular in the 80s. Seemed the
ideal way to keep track of all the various weightings
and values.

But, floating-point operations use a huge amount of
CPU/NPU power.

Seems somebody finally realized that the 'extra resolution'
of floating-point was rarely necessary and you can just
use large integers instead. Integer math is FAST and uses
LITTLE power .....

I did one or two apps long back using a sort of "fuzzy
logic". All the books had examples showing the use of
floating-point for dealing with the 'fuzzy' values.
However I quickly figured out that 32-bit ints offered
more than enough resolution and were very quick - esp
on micro-controllers.
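
For illustration, a minimal C sketch of that scaled-integer approach
(the Q16 format, names, and scale factor are invented for the example,
not taken from any particular fuzzy-logic text):

#include <stdint.h>

#define Q16_ONE 65536  /* 1.0 in Q16 fixed point */

/* Fuzzy AND/OR on membership grades in [0, Q16_ONE]:
   just min/max, no floating point anywhere. */
static uint32_t fuzzy_and(uint32_t a, uint32_t b) { return a < b ? a : b; }
static uint32_t fuzzy_or (uint32_t a, uint32_t b) { return a > b ? a : b; }

/* Q16 multiply: 32x32 -> 64-bit product, shifted back down. */
static uint32_t q16_mul(uint32_t a, uint32_t b)
{
    return (uint32_t)(((uint64_t)a * b) >> 16);
}

With grades stored as 0..65536 there is more resolution than a fuzzy
rule base needs, and every operation stays cheap on a microcontroller.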

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: Richard Kettlewell
Newsgroups: comp.os.linux.misc
Organization: terraraq NNTP server
Date: Sun, 13 Oct 2024 09:15 UTC
References: 1

"186282@ud0s4.net" <186283@ud0s4.net> writes:
> https://techxplore.com/news/2024-10-integer-addition-algorithm-energy-ai.html
[...]
> The default use of floating-point really took off when
> 'neural networks' became popular in the 80s. Seemed the
> ideal way to keep track of all the various weightings
> and values.
>
> But, floating-point operations use a huge amount of
> CPU/NPU power.
>
> Seems somebody finally realized that the 'extra resolution'
> of floating-point was rarely necessary and you can just
> use large integers instead. Integer math is FAST and uses
> LITTLE power .....

That’s situational. In this case, the paper isn’t about using large
integers, it’s about very low precision floating point representations.
They’ve just found a way to approximate floating point multiplication
without multiplying the fractional parts of the mantissas.

--
https://www.greenend.org.uk/rjk/
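
For a taste of how multiplication can be approximated without
multiplying mantissas, here is a classic bit trick in the same spirit
(NOT the paper's actual algorithm; the function name is invented). For
positive, normalized, finite IEEE-754 floats the raw bit pattern behaves
like a scaled log2 of the value, so adding bit patterns approximates
multiplying:

#include <stdint.h>
#include <string.h>

/* Add the bit patterns and remove the doubled exponent bias
   (0x3F800000 is the bit pattern of 1.0f). This is Mitchell's
   logarithmic multiplication in disguise: worst-case error is
   roughly 11%, e.g. approx_mul(3.0f, 5.0f) returns 14.0f. */
static float approx_mul(float x, float y)
{
    uint32_t xi, yi, zi;
    memcpy(&xi, &x, sizeof xi);
    memcpy(&yi, &y, sizeof yi);
    zi = xi + yi - 0x3F800000u;
    float z;
    memcpy(&z, &zi, sizeof z);
    return z;
}

One integer add replaces a full multiplier array, which is where the
energy saving comes from; the published method is a refinement of this
general idea that keeps the error much smaller.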

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: Pancho
Newsgroups: comp.os.linux.misc
Organization: A noiseless patient Spider
Date: Sun, 13 Oct 2024 10:45 UTC
References: 1

On 10/13/24 03:54, 186282@ud0s4.net wrote:

> The new technique is basic—instead of using complex
> floating-point multiplication (FPM), the method uses integer
> addition. Apps use FPM to handle extremely large or small
> numbers, allowing applications to carry out calculations
> using them with extreme precision. It is also the most
> energy-intensive part of AI number crunching.
>

That isn't really true. Floats can handle big and small, but the reason
people use them is for simplicity.

The problem is that typical integer calculations are not closed, the
result is not an integer. Addition is fine, but the result of division
is typically not an integer. So if you use integers to model a problem
every time you do a division (or exp, log, sin, etc) you need to make a
decision about how to force the result into an integer.

Floats actually use integral values for exponent and mantissa, but they
automatically make ballpark reasonable decisions about how to force the
results into integral values for mantissa and exponent, meaning
operations are effectively closed (ignoring exceptions). So the
programmer doesn't have to worry, so much.

Floating point ops are actually quite efficient, much less of a concern
than something like a branch misprediction. A 20x speed up (energy
saving) sounds close to a theoretical maximum. I would be surprised if
it can be achieved in anything but a few cases.
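
To make that "decision" concrete, a small C sketch of fixed-point
division (the Q16 format and names are illustrative only): plain integer
division silently truncates, while round-to-nearest, the choice IEEE
floats make for you by default, has to be coded by hand:

#include <stdint.h>

/* Q16 fixed point: 1.0 is 65536. Both routines assume b != 0. */

/* truncate toward zero - what bare integer division gives you */
static int32_t q16_div_trunc(int32_t a, int32_t b)
{
    return (int32_t)(((int64_t)a << 16) / b);
}

/* round to nearest - assumes a and b are positive */
static int32_t q16_div_round(int32_t a, int32_t b)
{
    return (int32_t)((((int64_t)a << 16) + b / 2) / b);
}

Every division, exp, log, or sin in an all-integer design needs one of
these choices made explicitly, which is exactly the bookkeeping floats
spare the programmer.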

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: The Natural Philosopher
Newsgroups: comp.os.linux.misc
Organization: A little, after lunch
Date: Sun, 13 Oct 2024 12:25 UTC
References: 1 2

On 13/10/2024 10:15, Richard Kettlewell wrote:
> "186282@ud0s4.net" <186283@ud0s4.net> writes:
>> https://techxplore.com/news/2024-10-integer-addition-algorithm-energy-ai.html
> [...]
>> The default use of floating-point really took off when
>> 'neural networks' became popular in the 80s. Seemed the
>> ideal way to keep track of all the various weightings
>> and values.
>>
>> But, floating-point operations use a huge amount of
>> CPU/NPU power.
>>
>> Seems somebody finally realized that the 'extra resolution'
>> of floating-point was rarely necessary and you can just
>> use large integers instead. Integer math is FAST and uses
>> LITTLE power .....
>
> That’s situational. In this case, the paper isn’t about using large
> integers, it’s about very low precision floating point representations.
> They’ve just found a way to approximate floating point multiplication
> without multiplying the fractional parts of the mantissas.
>
Last I heard they were going to use D-to-As feeding analog multipliers,
and convert back to digital afterwards, for a speed/precision tradeoff.

--
There is nothing a fleet of dispatchable nuclear power plants cannot do
that cannot be done worse and more expensively and with higher carbon
emissions and more adverse environmental impact by adding intermittent
renewable energy.

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: Pancho
Newsgroups: comp.os.linux.misc
Organization: A noiseless patient Spider
Date: Sun, 13 Oct 2024 13:23 UTC
References: 1 2 3

On 10/13/24 13:25, The Natural Philosopher wrote:
> On 13/10/2024 10:15, Richard Kettlewell wrote:
>> "186282@ud0s4.net" <186283@ud0s4.net> writes:
>>> https://techxplore.com/news/2024-10-integer-addition-algorithm-energy-ai.html
>> [...]
>>>    The default use of floating-point really took off when
>>>    'neural networks' became popular in the 80s. Seemed the
>>>    ideal way to keep track of all the various weightings
>>>    and values.
>>>
>>>    But, floating-point operations use a huge amount of
>>>    CPU/NPU power.
>>>
>>>    Seems somebody finally realized that the 'extra resolution'
>>>    of floating-point was rarely necessary and you can just
>>>    use large integers instead. Integer math is FAST and uses
>>>    LITTLE power .....
>>
>> That’s situational. In this case, the paper isn’t about using large
>> integers, it’s about very low precision floating point representations.
>> They’ve just found a way to approximate floating point multiplication
>> without multiplying the fractional parts of the mantissas.
>>
> Last I heard they were going to use D-to-As feeding analog multipliers,
> and convert back to digital afterwards, for a speed/precision tradeoff.
>

That sounds like the 1960s. I guess this idea does sound like a slide rule.

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: The Natural Philosopher
Newsgroups: comp.os.linux.misc
Organization: A little, after lunch
Date: Mon, 14 Oct 2024 10:16 UTC
References: 1 2 3 4

On 13/10/2024 14:23, Pancho wrote:
> On 10/13/24 13:25, The Natural Philosopher wrote:
>> On 13/10/2024 10:15, Richard Kettlewell wrote:
>>> "186282@ud0s4.net" <186283@ud0s4.net> writes:
>>>> https://techxplore.com/news/2024-10-integer-addition-algorithm-energy-ai.html
>>> [...]
>>>>    The default use of floating-point really took off when
>>>>    'neural networks' became popular in the 80s. Seemed the
>>>>    ideal way to keep track of all the various weightings
>>>>    and values.
>>>>
>>>>    But, floating-point operations use a huge amount of
>>>>    CPU/NPU power.
>>>>
>>>>    Seems somebody finally realized that the 'extra resolution'
>>>>    of floating-point was rarely necessary and you can just
>>>>    use large integers instead. Integer math is FAST and uses
>>>>    LITTLE power .....
>>>
>>> That’s situational. In this case, the paper isn’t about using large
>>> integers, it’s about very low precision floating point representations.
>>> They’ve just found a way to approximate floating point multiplication
>>> without multiplying the fractional parts of the mantissas.
>>>
>> Last I heard they were going to use D-to-As feeding analog
>> multipliers, and convert back to digital afterwards, for a speed/
>> precision tradeoff.
>>
>
> That sounds like the 1960s. I guess this idea does sound like a slide rule.

No, apparently it's a new (sic!) idea.

I think that even if it does not work successfully it is great that
people are thinking outside the box.
Analogue computers could offer massive parallelism for simulating
complex dynamic systems.

--
There’s a mighty big difference between good, sound reasons and reasons
that sound good.

Burton Hillis (William Vaughn, American columnist)

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: Computer Nerd Kev
Newsgroups: comp.os.linux.misc
Organization: Ausics - https://newsgroups.ausics.net
Date: Mon, 14 Oct 2024 21:10 UTC
References: 1 2 3 4 5

The Natural Philosopher <tnp@invalid.invalid> wrote:
> Analogue computers could offer massive parallelism for simulating
> complex dynamic systems.

If they have a solution for the typical problem of noise in the
analogue signals drowning out the "complex" simulations. Optical
methods are interesting.

--
__ __
#_ < |\| |< _#

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: The Natural Philosopher
Newsgroups: comp.os.linux.misc
Organization: A little, after lunch
Date: Mon, 14 Oct 2024 22:47 UTC
References: 1 2 3 4 5 6

On 14/10/2024 22:10, Computer Nerd Kev wrote:
> The Natural Philosopher <tnp@invalid.invalid> wrote:
>> Analogue computers could offer massive parallelism for simulating
>> complex dynamic systems.
>
> If they have a solution for the typical problem of noise in the
> analogue signals drowning out the "complex" simulations. Optical
> methods are interesting.
>

If they don't, then that is in itself a valuable indication: if two
runs give different results, they are modelling a chaotic system.
It doesn't matter how much precision you put on junk data, it's still junk.

--
"It is an established fact to 97% confidence limits that left wing
conspirators see right wing conspiracies everywhere"
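
A minimal sketch of that point (the map and constants are arbitrary,
chosen only to show the divergence): iterate a chaotic map in single and
double precision from the same seed, and the two "runs" soon disagree
completely; extra precision only delays it.

#include <stdio.h>

int main(void)
{
    float  xf = 0.5f;
    double xd = 0.5;
    /* logistic map in its chaotic regime (r = 3.9) */
    for (int i = 0; i < 60; i++) {
        xf = 3.9f * xf * (1.0f - xf);
        xd = 3.9  * xd * (1.0  - xd);
    }
    /* after ~60 steps the rounding differences have grown until
       the two trajectories are unrelated */
    printf("float: %f  double: %f\n", xf, xd);
    return 0;
}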

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: 186282@ud0s4.net
Newsgroups: comp.os.linux.misc
Organization: wokiesux
Date: Tue, 15 Oct 2024 06:30 UTC
References: 1 2

On 10/13/24 5:15 AM, Richard Kettlewell wrote:
> "186282@ud0s4.net" <186283@ud0s4.net> writes:
>> https://techxplore.com/news/2024-10-integer-addition-algorithm-energy-ai.html
> [...]
>> The default use of floating-point really took off when
>> 'neural networks' became popular in the 80s. Seemed the
>> ideal way to keep track of all the various weightings
>> and values.
>>
>> But, floating-point operations use a huge amount of
>> CPU/NPU power.
>>
>> Seems somebody finally realized that the 'extra resolution'
>> of floating-point was rarely necessary and you can just
>> use large integers instead. Integer math is FAST and uses
>> LITTLE power .....
>
> That’s situational. In this case, the paper isn’t about using large
> integers, it’s about very low precision floating point representations.
> They’ve just found a way to approximate floating point multiplication
> without multiplying the fractional parts of the mantissas.

They need to take it further - integers instead
of ANY floating-point absolutely anywhere possible.

The greenies have begun to freak over the sheer electric
power required by "AI" systems. It IS rather a lot. It's
getting worse than even bitcoin mining now. Judging by
the article, a large percentage of that energy is going
into un-needed floating-point calx.

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: 186282@ud0s4.net
Newsgroups: comp.os.linux.misc
Organization: wokiesux
Date: Tue, 15 Oct 2024 06:31 UTC
References: 1 2 3 4 5

On 10/14/24 6:16 AM, The Natural Philosopher wrote:
> On 13/10/2024 14:23, Pancho wrote:
>> On 10/13/24 13:25, The Natural Philosopher wrote:
>>> On 13/10/2024 10:15, Richard Kettlewell wrote:
>>>> "186282@ud0s4.net" <186283@ud0s4.net> writes:
>>>>> https://techxplore.com/news/2024-10-integer-addition-algorithm-energy-ai.html
>>>>>
>>>> [...]
>>>>>    The default use of floating-point really took off when
>>>>>    'neural networks' became popular in the 80s. Seemed the
>>>>>    ideal way to keep track of all the various weightings
>>>>>    and values.
>>>>>
>>>>>    But, floating-point operations use a huge amount of
>>>>>    CPU/NPU power.
>>>>>
>>>>>    Seems somebody finally realized that the 'extra resolution'
>>>>>    of floating-point was rarely necessary and you can just
>>>>>    use large integers instead. Integer math is FAST and uses
>>>>>    LITTLE power .....
>>>>
>>>> That’s situational. In this case, the paper isn’t about using large
>>>> integers, it’s about very low precision floating point representations.
>>>> They’ve just found a way to approximate floating point multiplication
>>>> without multiplying the fractional parts of the mantissas.
>>>>
>>> Last I heard they were going to use D-to-As feeding analog
>>> multipliers, and convert back to digital afterwards, for a speed/
>>> precision tradeoff.
>>>
>>
>> That sounds like the 1960s. I guess this idea does sound like a slide
>> rule.
>
> No, apparently its a new (sic!) idea.
>
> I think that even if it does not work successfully it is great that
> people are thinking outside the box.
> Analogue computers could offer massive parallelism for simulating
> complex dynamic systems.

Yea, but not much PRECISION beyond a stage or two
of calx :-)

No "perfect" fixes.

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: 186282@ud0s4.net
Newsgroups: comp.os.linux.misc
Organization: wokiesux
Date: Tue, 15 Oct 2024 06:43 UTC
References: 1 2

On 10/13/24 6:45 AM, Pancho wrote:
> On 10/13/24 03:54, 186282@ud0s4.net wrote:
>
>> The new technique is basic—instead of using complex
>> floating-point multiplication (FPM), the method uses integer
>> addition. Apps use FPM to handle extremely large or small
>> numbers, allowing applications to carry out calculations
>> using them with extreme precision. It is also the most
>> energy-intensive part of AI number crunching.
>>
>
> That isn't really true. Floats can handle big and small, but the reason
> people use them is for simplicity.

"Simple", usually. Energy/time-efficient ... not so much.

> The problem is that typical integer calculations are not closed, the
> result is not an integer. Addition is fine, but the result of division
> is typically not an integer. So if you use integers to model a problem
> every time you do a division (or exp, log, sin, etc) you need to make a
> decision about how to force the result into an integer.

The question is how EXACT the precision HAS to be for
most "AI" uses. Might be safe to throw away a few
decimal points at the bottom.

> Floats actually use integral values for exponent and mantissa, but they
> automatically make ballpark reasonable decisions about how to force the
> results into integral values for mantissa and exponent, meaning
> operations are effectively closed (ignoring exceptions).  So the
> programmer doesn't have to worry, so much.
>
> Floating point ops are actually quite efficient, much less of a concern
> than something like a branch misprediction. A 20x speed up (energy
> saving) sounds close to a theoretical maximum. I would be surprised if
> it can be achieved in anything but a few cases.

Well ... the article insists they are NOT energy-efficient,
esp when performed en-masse. I think their prelim tests
suggested an almost 95% savings (sometimes).

Anyway, at least the IDEA is back out there again. We
old guys, oft dealing with microcontrollers, knew the
advantages of wider integers over even 'small' FP.

Math processors disguised the amount of processing
required for FP ... but it was STILL there.
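
What "throwing away a few decimal points" usually looks like on the ML
side is int8 quantization; a hedged C sketch (the scale choice and names
are illustrative, not any particular library's API):

#include <stdint.h>
#include <math.h>

/* Map a float weight to int8 with a per-tensor scale factor,
   e.g. scale = max_abs_weight / 127. */
static int8_t quantize(float x, float scale)
{
    float q = roundf(x / scale);
    if (q >  127.0f) q =  127.0f;   /* clamp to int8 range */
    if (q < -128.0f) q = -128.0f;
    return (int8_t)q;
}

static float dequantize(int8_t q, float scale)
{
    return q * scale;   /* recovers x to within scale/2 */
}

The round-trip error is bounded by half a scale step, and in practice
trained networks tolerate it, which is the point about how little of the
float resolution actually gets used.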

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: Mike Scott
Newsgroups: comp.os.linux.misc
Organization: Scott family
Date: Tue, 15 Oct 2024 07:52 UTC
References: 1 2 3

On 15/10/2024 07:30, 186282@ud0s4.net wrote:
>>
>> That’s situational. In this case, the paper isn’t about using large
>> integers, it’s about very low precision floating point representations.
>> They’ve just found a way to approximate floating point multiplication
>> without multiplying the fractional parts of the mantissas.
>
>
>   They need to take it further - integers instead
>   of ANY floating-point absolutely anywhere possible.

Reminds me of PDP8 days.

We were doing fft's by the million. All done in 12-bit integer
arithmetic with a block exponent. Lookup tables for logs were simple
enough, as were trig functions. Not that anything was exactly "fast" --
IIRC a 1.2usec basic instruction cycle.

The machine did have a FP unit, but it was too s..l..o..w.. by far for this.

The circle goes around.

--
Mike Scott
Harlow, England
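
That "block exponent" trick survives today under the name block floating
point; a rough C sketch of the idea (16-bit ints standing in for the
PDP-8's 12-bit words, all details invented for illustration):

#include <stddef.h>
#include <stdint.h>

typedef struct {
    int16_t *data;   /* integer mantissas, e.g. one FFT buffer */
    size_t   n;
    int      exp;    /* ONE power-of-two exponent shared by all */
} block_t;

/* value of element i is data[i] * 2^exp */

/* Called between FFT passes: when values threaten to overflow,
   halve every mantissa and bump the shared exponent instead. */
static void renormalize(block_t *b)
{
    for (size_t i = 0; i < b->n; i++)
        b->data[i] >>= 1;
    b->exp += 1;
}

One exponent per block instead of one per value keeps the arithmetic
pure-integer, exactly the compromise that beat the slow FP unit.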

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: Richard Kettlewell
Newsgroups: comp.os.linux.misc
Organization: terraraq NNTP server
Date: Tue, 15 Oct 2024 08:14 UTC
References: 1 2 3

"186282@ud0s4.net" <186283@ud0s4.net> writes:
> On 10/13/24 5:15 AM, Richard Kettlewell wrote:
>> "186282@ud0s4.net" <186283@ud0s4.net> writes:
>>> https://techxplore.com/news/2024-10-integer-addition-algorithm-energy-ai.html
>> [...]
>>> The default use of floating-point really took off when
>>> 'neural networks' became popular in the 80s. Seemed the
>>> ideal way to keep track of all the various weightings
>>> and values.
>>>
>>> But, floating-point operations use a huge amount of
>>> CPU/NPU power.
>>>
>>> Seems somebody finally realized that the 'extra resolution'
>>> of floating-point was rarely necessary and you can just
>>> use large integers instead. Integer math is FAST and uses
>>> LITTLE power .....
>> That’s situational. In this case, the paper isn’t about using large
>> integers, it’s about very low precision floating point representations.
>> They’ve just found a way to approximate floating point multiplication
>> without multiplying the fractional parts of the mantissas.
>
> They need to take it further - integers instead
> of ANY floating-point absolutely anywhere possible.

Perhaps you could publish your alternative algorithm that satisfies
their use case.

--
https://www.greenend.org.uk/rjk/

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: The Natural Philosopher
Newsgroups: comp.os.linux.misc
Organization: A little, after lunch
Date: Tue, 15 Oct 2024 11:03 UTC
References: 1 2 3 4 5 6

On 15/10/2024 07:31, 186282@ud0s4.net wrote:
> On 10/14/24 6:16 AM, The Natural Philosopher wrote:

>> I think that even if it does not work successfully it is great that
>> people are thinking outside the box.
>> Analogue computers could offer massive parallelism for simulating
>> complex dynamic systems.
>
>
>   Yea, but not much PRECISION beyond a stage or two
>   of calx  :-)
>
>   No "perfect" fixes.

As I said, let's say we are simulating airflow over a fast-moving
object - now normally the fluid dynamics CFD is crap and it is cheaper
and more accurate to throw it in a wind tunnel.

The wind tunnel is not measuring data to any high accuracy but it's
using atomic level measurement cells in enormous quantities in parallel.

The problem with CFD is you can't have too many 'cells' or you run out
of computer power. It's a step beyond 3D modelling where the more
triangles you have the closer to real everything looks, but it's a
similar problem.

But a wind tunnel built out of analogue 'cells' might be quite simple in
concept. Just large in silicon scale.

And it wouldn't need to be 'programmed' as its internal logic would be
constructed to be the equations that govern fluid dynamics. All you
would then do is take a 3D surface and constrain every cell in that
computer on that surface to have zero output.

If I were a graduate again that's a PhD project that would appeal...

--
There is nothing a fleet of dispatchable nuclear power plants cannot do
that cannot be done worse and more expensively and with higher carbon
emissions and more adverse environmental impact by adding intermittent
renewable energy.

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: The Natural Philosopher
Newsgroups: comp.os.linux.misc
Organization: A little, after lunch
Date: Tue, 15 Oct 2024 11:06 UTC
References: 1 2 3

On 15/10/2024 07:43, 186282@ud0s4.net wrote:
> The question is how EXACT the precision HAS to be for
>   most "AI" uses. Might be safe to throw away a few
>   decimal points at the bottom.

My thesis is that *in some applications*, more low quality calculations
beat fewer high quality ones anyway.
I wasn't thinking of AI so much as modelling complex turbulent flow in
aero and hydrodynamics or weather forecasting.
--
Outside of a dog, a book is a man's best friend. Inside of a dog it's
too dark to read.

Groucho Marx

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: rbowman
Newsgroups: comp.os.linux.misc
Date: Tue, 15 Oct 2024 19:20 UTC
References: 1 2 3

On Tue, 15 Oct 2024 02:43:08 -0400, 186282@ud0s4.net wrote:

> The question is how EXACT the precision HAS to be for most "AI" uses.
> Might be safe to throw away a few decimal points at the bottom.

It's usually referred to as 'machine learning' rather than AI but when you
look at TinyML on edge devices doing image recognition, wake word
processing, and other tasks it's impressive how much you can throw away
and still get a reasonable quality of results.

https://www.tinyml.org/

This goes back to the slide rule days. Sure, you could whip out your book
of six place tables and get seemingly more accurate results but did all
those decimal places mean anything in the real world? Computers took the
pain out of calculations but also tended to avoid the questions of 'what
does this really mean in the real world'.

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: Chris Ahlstrom
Newsgroups: comp.os.linux.misc
Organization: None
Date: Tue, 15 Oct 2024 19:46 UTC
References: 1 2 3 4

rbowman wrote this copyrighted missive and expects royalties:

> On Tue, 15 Oct 2024 02:43:08 -0400, 186282@ud0s4.net wrote:
>
>> The question is how EXACT the precision HAS to be for most "AI" uses.
>> Might be safe to throw away a few decimal points at the bottom.
>
> It's usually referred to as 'machine learning' rather than AI but when you
> look at TinyML on edge devices doing image recognition, wake word
> processing, and other tasks it's impressive how much you can throw away
> and still get a reasonable quality of results.
>
> https://www.tinyml.org/
>
> This goes back to the slide rule days. Sure, you could whip out your book
> of six place tables and get seemingly more accurate results but did all
> those decimal places mean anything in the real world? Computers took the
> pain out of calculations but also tended to avoid the questions of 'what
> does this really mean in the real world'.

In high school chemistry, we learned how to apply uncertainty ranges (plus or
minus) to measurements and how to accumulate ranges based on multiple
measurements.

The political polls state ranges, but nothing about the alpha, the N, and,
most importantly, the wording of the poll questions and the nature of the
sampling.

--
It was the Law of the Sea, they said. Civilization ends at the waterline.
Beyond that, we all enter the food chain, and not always right at the top.
-- Hunter S. Thompson
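
For anyone rusty on the rule he means, the worst-case form taught in
that chemistry class (a textbook refresher, not anything from the
thread): absolute uncertainties add under addition, relative ones under
multiplication.

  (12.0 +/- 0.1) + (3.0 +/- 0.2)  =  15.0 +/- 0.3
  (10.0 +/- 0.1) * (5.0 +/- 0.1): 1% + 2% = 3%, so 50.0 +/- 1.5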

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: rbowman
Newsgroups: comp.os.linux.misc
Date: Wed, 16 Oct 2024 02:35 UTC
References: 1 2 3 4 5

On Tue, 15 Oct 2024 15:46:05 -0400, Chris Ahlstrom wrote:

> The political polls state ranges, but nothing about the alpha, the N,
> and,
> most importantly, the wording of the poll questions and the nature of
> the sampling.

I try to ignore polls and most of the hype. A few years back I went to bed
expecting Hillary Clinton to be the president elect when I woke up. The DJ
on the radio station I listen to in the morning was a definite lefty. When
he played Norah Jones' 'Carry On' I found I'd been mistaken.

https://www.youtube.com/watch?v=DqA25Ug71Mc

"Let's just forget
Leave it behind
And carry on."

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: 186282@ud0s4.net
Newsgroups: comp.os.linux.misc
Organization: wokiesux
Date: Wed, 16 Oct 2024 06:38 UTC
References: 1 2 3 4

On 10/15/24 7:06 AM, The Natural Philosopher wrote:
> On 15/10/2024 07:43, 186282@ud0s4.net wrote:
>> The question is how EXACT the precision HAS to be for
>>    most "AI" uses. Might be safe to throw away a few
>>    decimal points at the bottom.
>
> My thesis is that *in some applications*, more low quality calculations
> beat fewer high quality ones anyway.
> I wasn't thinking of AI so much as modelling complex turbulent flow in
> aero and hydrodynamics or weather forecasting.

Well, with weather any decimal points are BS anyway :-)

However, AI and fuzzy logic and neural networks - it
has just been standard practice to use floats to handle
all values. I've got books going back into the mid 80s
on all those and you JUST USED floats.

BUT ... as said, even a 32-bit int can handle fairly
large vals. Mult little vals by 100 or 1000 and you can
throw away the need for decimal points - and the POWER
required to do such calx. Accuracy should be more than
adequate.

In any case, I'm happy SOMEONE finally realized this.

TOOK a really LONG time though ......

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: 186282@ud0s4.net
Newsgroups: comp.os.linux.misc
Organization: wokiesux
Date: Wed, 16 Oct 2024 06:54 UTC
References: 1 2 3 4 5 6 7

On 10/15/24 7:03 AM, The Natural Philosopher wrote:
> On 15/10/2024 07:31, 186282@ud0s4.net wrote:
>> On 10/14/24 6:16 AM, The Natural Philosopher wrote:
>
>>> I think that even if it does not work successfully it is great that
>>> people are thinking outside the box.
>>> Analogue computers could offer massive parallelism for simulating
>>> complex dynamic systems.
>>
>>
>>    Yea, but not much PRECISION beyond a stage or two
>>    of calx  :-)
>>
>>    No "perfect" fixes.
>
> As I said, let's say we are simulating airflow over a fast-moving
> object - now normally the fluid dynamics CFD is crap and it is cheaper
> and more accurate to throw it in a wind tunnel.

Very likely ... though I've never thrown anything into
a wind tunnel.

Analog still has a place. Until you go atomic it really
is a very analog universe.

In theory you can do "digitized analog" ... signal
levels that seem/act analog but are really finely
discrete digital values. This CAN minimize the
chain-calc accuracy problem.

> The wind tunnel is not measuring data to any high accuracy but it's
> using atomic level measurement cells in enormous quantities in parallel.
>
> The problem with CFD is you can't have too many 'cells' or you run out of
> computer power. It's a step beyond 3D modelling where the more triangles
> you have the closer to real everything looks, but it's a similar problem.
>
> But a wind tunnel built out of analogue 'cells' might be quite simple in
> concept. Just large in silicon scale.
>
> And it wouldn't need to be 'programmed' as its internal logic would be
> constructed to be the equations that govern fluid dynamics. All you
> would then do is take a 3D surface and constrain every cell in that
> computer on that surface to have zero output.
>
> If I were a graduate again that's a PhD project that would appeal...

I've seen old analog computers - mostly aimed at finding
spring rates and such. Rs, caps, inductors ... you can
sim a somewhat complex mechanical system just by plugging
in modules. Real-time and adequately accurate. You can
fake it in digital now however ... but it's not as
beautiful/natural.
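
"Faking it in digital" just means numerically integrating the same
equation the R/L/C network embodies, m*x'' + c*x' + k*x = 0. A rough
sketch (semi-implicit Euler, constants made up):

  # Digital stand-in for an analog spring-rate computer:
  # step m*x'' + c*x' + k*x = 0 forward in time.
  m, c, k = 1.0, 0.2, 4.0       # mass, damping, spring rate (made up)
  x, v = 1.0, 0.0               # displaced 1 unit, released at rest
  dt = 0.01

  for step in range(500):
      a = (-c * v - k * x) / m  # acceleration from the spring equation
      v += a * dt               # semi-implicit Euler: velocity first...
      x += v * dt               # ...then position, for stability
      if step % 100 == 0:
          print(f"t={step*dt:4.1f}  x={x:+.4f}")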

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: 186282@ud0s4.net
Newsgroups: comp.os.linux.misc
Organization: wokiesux
Date: Wed, 16 Oct 2024 07:13 UTC
References: 1 2 3 4 5 6

On 10/15/24 10:35 PM, rbowman wrote:
> On Tue, 15 Oct 2024 15:46:05 -0400, Chris Ahlstrom wrote:
>
>> The political polls state ranges, but nothing about the alpha, the N,
>> and,
>> most importantly, the wording of the poll questions and the nature of
>> the sampling.
>
> I try to ignore polls and most of the hype. A few years back I went to bed
> expecting Hillary Clinton to be the president elect when I woke up. The DJ
> on the radio station I listen to in the morning was a definite lefty.
> When he
> played Norah Jones' 'Carry On' I found I'd been mistaken.
>
> https://www.youtube.com/watch?v=DqA25Ug71Mc

Trump IS grating ... no question ... but K is just
an empty skull. That's been her JOB. Can't have
someone like that in times like these.

Not entirely sure of the Linux angle here though ...

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: Pancho
Newsgroups: comp.os.linux.misc
Organization: A noiseless patient Spider
Date: Wed, 16 Oct 2024 07:23 UTC
References: 1 2 3 4 5

On 10/16/24 07:38, 186282@ud0s4.net wrote:
> On 10/15/24 7:06 AM, The Natural Philosopher wrote:
>> On 15/10/2024 07:43, 186282@ud0s4.net wrote:
>>> The question is how EXACT the precision HAS to be for
>>>    most "AI" uses. Might be safe to throw away a few
>>>    decimal points at the bottom.
>>
>> My thesis is that *in some applications*, more low-quality
>> calculations beats fewer high-quality ones anyway.
>> I wasn't thinking of AI, as much as modelling complex turbulent flow
>> in aero and hydrodynamics or weather forecasting.
>
>   Well, weather, any decimal points are BS anyway :-)
>
>   However, AI and fuzzy logic and neural networks - it
>   has just been standard practice to use floats to handle
>   all values. I've got books going back into the mid 80s
>   on all those and you JUST USED floats.
>
>   BUT ... as said, even a 32-bit int can handle fairly
>   large vals. Mult little vals by 100 or 1000 and you can
>   throw away the need for decimal points - and the POWER
>   required to do such calx. Accuracy should be more than
>   adequate.
>
>   In any case, I'm happy SOMEONE finally realized this.
>
>   TOOK a really LONG time though ......

AIUI, early GPU/CUDA hardware only offered 32-bit floats, no doubles
(double precision only arrived with later compute capabilities). So I
think people always knew.

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: Richard Kettlewell
Newsgroups: comp.os.linux.misc
Organization: terraraq NNTP server
Date: Wed, 16 Oct 2024 10:56 UTC
References: 1 2 3 4 5

"186282@ud0s4.net" <186283@ud0s4.net> writes:
> BUT ... as said, even a 32-bit int can handle fairly
> large vals. Mult little vals by 100 or 1000 and you can
> throw away the need for decimal points - and the POWER
> required to do such calx. Accuracy should be more than
> adequate.

You’re talking about fixed-point arithmetic, which is already used where
appropriate (although the scale is a power of 2 so you can shift
products down into the right place rather than dividing).
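
In code, the power-of-2 version looks something like this (Q16.16
format, illustrative only):

  # Q16.16 fixed point: the scale is 2**16, so rescaling after a
  # multiply is a right-shift rather than a divide.
  FRAC = 16                     # fractional bits

  def to_fix(x):
      return int(round(x * (1 << FRAC)))

  def fix_mul(a, b):
      return (a * b) >> FRAC    # product has 2*FRAC fraction bits; drop FRAC

  a = to_fix(1.5)               # 98304
  b = to_fix(2.25)              # 147456
  print(fix_mul(a, b) / (1 << FRAC))    # 3.375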

> In any case, I'm happy SOMEONE finally realized this.
>
> TOOK a really LONG time though ......

It’s obvious that you’ve not actually read or understood the paper that
this thread is about.

--
https://www.greenend.org.uk/rjk/

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: Chris Ahlstrom
Newsgroups: comp.os.linux.misc
Organization: None
Date: Wed, 16 Oct 2024 11:40 UTC
References: 1 2 3 4 5 6 7

186282@ud0s4.net wrote this copyrighted missive and expects royalties:

> On 10/15/24 10:35 PM, rbowman wrote:
>> On Tue, 15 Oct 2024 15:46:05 -0400, Chris Ahlstrom wrote:
>>
>>> The political polls state ranges, but nothing about the alpha, the N,
>>> and,
>>> most importantly, the wording of the poll questions and the nature of
>>> the sampling.
>>
>> I try to ignore polls and most of the hype. A few years back I went to bed
>> expecting Hillary Clinton to be the president elect when I woke up. The DJ
>> on the radio station I listen to in the morning was a definite lefty.
>> When he
>> played Norah Jones' 'Carry On' I found I'd been mistaken.
>>
>> https://www.youtube.com/watch?v=DqA25Ug71Mc
>
> Trump IS grating ... no question ... but K is just
> an empty skull. That's been her JOB. Can't have
> someone like that in times like these.

Trump's the empty skull. Well, it is full... of nonsense and bile.

> Not entirely sure of the Linux angle here though ...

Harris as VP was like Linux, working reliably in the background.

She's no empty skull. She was a prosecutor, a district attorney, a state
attorney general, a US senator, and the vice president. But some people cannot
stand that in a woman.

--
I began many years ago, as so many young men do, in searching for the
perfect woman. I believed that if I looked long enough, and hard enough,
I would find her and then I would be secure for life. Well, the years
and romances came and went, and I eventually ended up settling for someone
a lot less than my idea of perfection. But one day, after many years
together, I lay there on our bed recovering from a slight illness. My
wife was sitting on a chair next to the bed, humming softly and watching
the late afternoon sun filtering through the trees. The only sounds to
be heard elsewhere were the clock ticking, the kettle downstairs starting
to boil, and an occasional schoolchild passing beneath our window. And
as I looked up into my wife's now wrinkled face, but still warm and
twinkling eyes, I realized something about perfection... It comes only
with time.
-- James L. Collymore, "Perfect Woman"

Subject: Re: Well DUH ! AI People Finally Realize They Can Ditch Most Floating-Point for Big Ints
From: Charlie Gibbs
Newsgroups: comp.os.linux.misc
Date: Wed, 16 Oct 2024 16:16 UTC
References: 1 2 3 4 5 6 7 8

On 2024-10-16, Chris Ahlstrom <OFeem1987@teleworm.us> wrote:

> Harris as VP was like Linux, working reliably in the background.
>
> She's no empty skull. She was a prosecutor, a district attorney, a state
> attorney general, a US senator, and the vice president. But some people cannot
> stand that in a woman.

<applause>

> I began many years ago, as so many young men do, in searching for the
> perfect woman. I believed that if I looked long enough, and hard enough,
> I would find her and then I would be secure for life. Well, the years
> and romances came and went, and I eventually ended up settling for someone
> a lot less than my idea of perfection. But one day, after many years
> together, I lay there on our bed recovering from a slight illness. My
> wife was sitting on a chair next to the bed, humming softly and watching
> the late afternoon sun filtering through the trees. The only sounds to
> be heard elsewhere were the clock ticking, the kettle downstairs starting
> to boil, and an occasional schoolchild passing beneath our window. And
> as I looked up into my wife's now wrinkled face, but still warm and
> twinkling eyes, I realized something about perfection... It comes only
> with time.
> -- James L. Collymore, "Perfect Woman"

Beautiful. Here's Heinlein's take on it:

A man does not insist on physical beauty in a woman who
builds up his morale. After a while he realizes that
she _is_ beautiful - he just hadn't noticed it at first.

--
/~\ Charlie Gibbs | We'll go down in history as the
\ / <cgibbs@kltpzyxm.invalid> | first society that wouldn't save
X I'm really at ac.dekanfrus | itself because it wasn't cost-
/ \ if you read it the right way. | effective. -- Kurt Vonnegut
