Rocksolid Light

News from da outaworlds


Subject                                                                   Author
* Neural Networks (MNIST inference) on the “3-cent” Microcontroller      D. Ray
`* Re: Neural Networks (MNIST inference) on the “3-cent” Microcontroller olcott
 `- Re: Neural Networks (MNIST inference) on the “3-cent” Microcontroller D. Ray

Subject: Neural Networks (MNIST inference) on the “3-cent” Microcontroller
From: D. Ray
Newsgroups: comp.misc, comp.ai.philosophy, alt.microcontrollers, comp.arch.embedded, alt.microcontrollers.8bit
Organization: Usenet.Farm
Date: Mon, 21 Oct 2024 20:06 UTC

Buoyed by the surprisingly good performance of neural networks with
quantization-aware training on the CH32V003, I wondered how far this can be
pushed. How much can we compress a neural network while still achieving
good test accuracy on the MNIST dataset? When it comes to absolutely
low-end microcontrollers, there is hardly a more compelling target than the
Padauk 8-bit microcontrollers. These are microcontrollers optimized for the
simplest and lowest-cost applications there are. The smallest device of the
portfolio, the PMS150C, sports 1024 13-bit words of one-time-programmable
memory and 64 bytes of RAM, more than an order of magnitude smaller than
the CH32V003. In addition, it has a proprietary accumulator-based 8-bit
architecture, as opposed to the much more powerful RISC-V instruction set.

Is it possible to implement an MNIST inference engine that can classify
handwritten digits on a PMS150C as well?

<https://cpldcpu.wordpress.com/2024/05/02/machine-learning-mnist-inference-on-the-3-cent-microcontroller/>

<https://archive.md/DzqzL>

Subject: Re: Neural Networks (MNIST inference) on the “3-cent” Microcontroller
From: olcott
Newsgroups: comp.misc, comp.ai.philosophy, alt.microcontrollers, comp.arch.embedded, alt.microcontrollers.8bit
Date: Sun, 27 Oct 2024 01:43 UTC
References: 1

On 10/21/2024 3:06 PM, D. Ray wrote:
> Buoyed by the surprisingly good performance of neural networks with
> quantization aware training on the CH32V003, I wondered how far this can be
> pushed. How much can we compress a neural network while still achieving
> good test accuracy on the MNIST dataset? When it comes to absolutely
> low-end microcontrollers, there is hardly a more compelling target than the
> Padauk 8-bit microcontrollers. These are microcontrollers optimized for the
> simplest and lowest cost applications there are. The smallest device of the
> portfolio, the PMS150C, sports 1024 13-bit word one-time-programmable
> memory and 64 bytes of ram, more than an order of magnitude smaller than
> the CH32V003. In addition, it has a proprietary accumulator-based 8-bit
> architecture, as opposed to a much more powerful RISC-V instruction set.
>
> Is it possible to implement an MNIST inference engine, which can classify
> handwritten numbers, also on a PMS150C?
>
> …
>
> …
>
> <https://cpldcpu.wordpress.com/2024/05/02/machine-learning-mnist-inference-on-the-3-cent-microcontroller/>
>
> <https://archive.md/DzqzL>

Test to see if this posts, or whether I should dump this paid provider.

--
Copyright 2024 Olcott

"Talent hits a target no one else can hit;
Genius hits a target no one else can see."
Arthur Schopenhauer

Subject: Re: Neural Networks (MNIST inference) on the “3-cent” Microcontroller
From: D. Ray
Newsgroups: alt.microcontrollers, comp.misc, comp.ai.philosophy, comp.arch.embedded, alt.microcontrollers.8bit
Organization: Usenet.Farm
Date: Mon, 28 Oct 2024 15:42 UTC
References: 1 2

olcott <NoOne@NoWhere.com> wrote:
> On 10/21/2024 3:06 PM, D. Ray wrote:
>> Buoyed by the surprisingly good performance of neural networks with
>> quantization aware training on the CH32V003, I wondered how far this can be
>> pushed. How much can we compress a neural network while still achieving
>> good test accuracy on the MNIST dataset? When it comes to absolutely
>> low-end microcontrollers, there is hardly a more compelling target than the
>> Padauk 8-bit microcontrollers. These are microcontrollers optimized for the
>> simplest and lowest cost applications there are. The smallest device of the
>> portfolio, the PMS150C, sports 1024 13-bit word one-time-programmable
>> memory and 64 bytes of ram, more than an order of magnitude smaller than
>> the CH32V003. In addition, it has a proprietary accumulator-based 8-bit
>> architecture, as opposed to a much more powerful RISC-V instruction set.
>>
>> Is it possible to implement an MNIST inference engine, which can classify
>> handwritten numbers, also on a PMS150C?
>>
>> …
>>
>> …
>>
>> <https://cpldcpu.wordpress.com/2024/05/02/machine-learning-mnist-inference-on-the-3-cent-microcontroller/>
>>
>> <https://archive.md/DzqzL>
>
> test to see if this posts or I should dump this paid provider.

It worked.
