On 2023-02-24 03:52, Alan Cudmore wrote:
Hi all,
On Thu, Feb 23, 2023 at 4:00 PM Karel Gardas <karel@functional.vision> 
wrote:

    Hi Prakhar,

    On 2/23/23 20:23, Prakhar Agrawal wrote:
     > I completely agree with all your points, but my rationale for
     > introducing the Jetson Nano or Jetson AGX Orin was because of
     > their GPU power.

    It's really nice what Nvidia achieved here, right? Unfortunately this
    GPU potential is fully locked up by the binary driver NVidia provides
    for only a select number of platforms --- if not just one: Linux. So it
    is very questionable how you would unlock that on RTEMS during the
    limited time of GSoC. Just see what the Nouveau folks have been doing
    for years: https://nouveau.freedesktop.org/ -- and they have only
    barely reached 3D acceleration. Just clone their git repo, look at the
    number of patches, the lines of code and the number of people involved,
    and I think you will get an idea of how mammoth a task this is...

     >
     > In the case of large hobby projects or maybe the initial days of a
     > startup (seed stage), a real-time system that can work with boards
     > having a good GPU can do wonders.
     > For example, for an autonomous vehicle, L2/L3 autonomy can be
     > achieved using a 60 W Jetson AGX Orin. Hence, if RTEMS support is
     > added to the board, it might help create an awesome system to handle
     > all the critical time constraints necessary for the vehicle and give
     > it the ability to coordinate a large number of concurrent activities.

    If you are interested in AI-based machine vision and robotics, why not
    look around for a more open-source friendly solution? I recently found
    the i.MX 8M Plus and its claimed 2.3 TOPS NPU. Certainly not as
    powerful as the NVidia parts, but NXP has historically been more
    friendly to 3rd-party OSes. I am not sure about the NPU, as I have not
    had time to investigate it yet, but perhaps you do?

    Also, with the i.MX 8M Plus you still have a chance to use AI vision in
    a non-real-time manner running on top of Linux, and run RTEMS real-time
    tasks on the built-in Cortex-M7 -- I mean, if you decide that this
    particular BSP may be your GSoC. :-)

    
https://www.nxp.com/products/processors-and-microcontrollers/arm-processors/i-mx-applications-processors/i-mx-8-applications-processors/i-mx-8m-plus-arm-cortex-a53-machine-learning-vision-multimedia-and-industrial-iot:IMX8MPLUS

     >> Honestly I'd rather see a new BSP for a decent RISC-V board.
     >
     > I was reading about RISC-V and its comparison with ARM SBCs, and in
     > one blog I read this: "ARM processors have benefited from a lot more
     > research, funding, and development than RISC-V. This means that it
     > can be argued that RISC-V is being left behind."

    Do not worry about it. RISC-V is here to stay. A lot has already been
    invested in it and much more still will be...

I'm working on submitting a RISC-V BSP variant for the Kendryte K210 CPU. It's low cost and has a 1 TOPS NPU. I don't think the NPU needs a binary driver, and it is typically used with FreeRTOS or bare metal. But I do like the idea of a dual-CPU system where a Linux/AI processor can work with an RTOS-based MCU for real-time tasks.
Systems where Linux (or another desktop-like OS) runs on one core and
RTEMS on another would be interesting for other cases too, for example an
industrial system where you have a complex GUI and a real-time part. We
had some systems where we thought about implementing something like that.
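
To make that split a bit more concrete, here is a minimal sketch (RTEMS
Classic API) of the kind of periodic real-time task that could run on the
RTOS side of such a dual-core (AMP) setup. The shared mailbox structure and
the read_sensor()/apply_control() helpers are purely hypothetical
placeholders; a real system would use a proper inter-core transport (for
example RPMsg/OpenAMP) and place the shared data at an agreed fixed address.

/*
 * Sketch only: periodic real-time task for the RTEMS side of a
 * Linux + RTEMS dual-core (AMP) system. Mailbox layout and the
 * read_sensor()/apply_control() helpers are hypothetical placeholders.
 */
#include <rtems.h>
#include <stdint.h>

/* Hypothetical mailbox in shared memory, polled by Linux on the other core. */
typedef struct {
  volatile uint32_t sequence;
  volatile int32_t  latest_sample;
} shared_mailbox;

static shared_mailbox mailbox; /* a real AMP setup would pin this to a fixed address */

static int32_t read_sensor(void)        { return 0; }   /* placeholder */
static void    apply_control(int32_t v) { (void) v; }   /* placeholder */

static rtems_task control_loop(rtems_task_argument arg)
{
  (void) arg;
  /* 1 ms period: the hard real-time work stays here, the AI stays on Linux. */
  rtems_interval period = RTEMS_MILLISECONDS_TO_TICKS(1);

  while (1) {
    int32_t sample = read_sensor();
    apply_control(sample);

    /* Publish the latest sample for the non-real-time (Linux) side. */
    mailbox.latest_sample = sample;
    mailbox.sequence++;

    rtems_task_wake_after(period);
  }
}

rtems_task Init(rtems_task_argument arg)
{
  (void) arg;
  rtems_id id;
  rtems_status_code sc;

  sc = rtems_task_create(
    rtems_build_name('C', 'T', 'R', 'L'),
    1,                            /* high priority for the control loop */
    RTEMS_MINIMUM_STACK_SIZE,
    RTEMS_DEFAULT_MODES,
    RTEMS_DEFAULT_ATTRIBUTES,
    &id
  );
  if (sc == RTEMS_SUCCESSFUL) {
    rtems_task_start(id, control_loop, 0);
  }
  rtems_task_exit();
}

#define CONFIGURE_APPLICATION_NEEDS_CLOCK_DRIVER
#define CONFIGURE_APPLICATION_NEEDS_CONSOLE_DRIVER
#define CONFIGURE_MAXIMUM_TASKS 2
#define CONFIGURE_RTEMS_INIT_TASKS_TABLE
#define CONFIGURE_INIT
#include <rtems/confdefs.h>

The point of the split is that the AI/vision workload never touches this
core; the RTEMS side only consumes results and keeps its deadlines.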
Supply chain issues aside, I am also interested in the Pine64 Ox64 and its
multiple RISC-V CPUs. I have also been watching the VisionFive 2, which has
a quad-core RISC-V CPU. The VisionFive 2 Linux support is still maturing,
but it does have OpenSBI and U-Boot, so it might be possible to load RTEMS
images over TFTP.
https://www.kickstarter.com/projects/starfive/visionfive-2
https://wiki.pine64.org/wiki/Ox64

For ARM-based AI systems, what about the BeagleBone AI?
https://beagleboard.org/AI

But maybe a GSoC-sized project related to AI would be to integrate a library such as TensorFlow Lite or TinyMaix:
https://github.com/sipeed/TinyMaix <https://github.com/sipeed/TinyMaix>
https://www.tensorflow.org/lite <https://www.tensorflow.org/lite>

These might work with well-supported RTEMS boards like the BeagleBone Black.
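
For a rough idea of what such an integration might look like on the RTEMS
side, here is a minimal sketch. The model_invoke() wrapper is a hypothetical
placeholder standing in for whichever library call (TinyMaix or TensorFlow
Lite Micro) actually gets integrated; it is not a real API of either project,
and the 28x28 input is just an illustrative classifier shape.

/*
 * Sketch only: wrapping an inference library in an RTEMS application on a
 * board like the BeagleBone Black. model_invoke() is a hypothetical wrapper.
 */
#include <rtems.h>
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

#define INPUT_SIZE  (28 * 28)  /* example: a small image classifier */
#define OUTPUT_SIZE 10

/* Hypothetical wrapper around the chosen inference library. */
static int model_invoke(const uint8_t *input, float *scores)
{
  (void) input;
  for (int i = 0; i < OUTPUT_SIZE; ++i) {
    scores[i] = 0.0f; /* the real wrapper would fill in class scores */
  }
  return 0;
}

rtems_task Init(rtems_task_argument arg)
{
  (void) arg;
  static uint8_t input[INPUT_SIZE]; /* would come from a camera or a test vector */
  float scores[OUTPUT_SIZE];

  if (model_invoke(input, scores) == 0) {
    int best = 0;
    for (int i = 1; i < OUTPUT_SIZE; ++i) {
      if (scores[i] > scores[best]) {
        best = i;
      }
    }
    printf("predicted class: %d\n", best);
  }

  exit(0);
}

#define CONFIGURE_APPLICATION_NEEDS_CLOCK_DRIVER
#define CONFIGURE_APPLICATION_NEEDS_CONSOLE_DRIVER
#define CONFIGURE_MAXIMUM_TASKS 1
#define CONFIGURE_RTEMS_INIT_TASKS_TABLE
#define CONFIGURE_INIT
#include <rtems/confdefs.h>

TinyMaix in particular advertises itself as a small, dependency-free C
library, which is the kind of code that usually ports to RTEMS with little
effort.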
Regards,
Alan

    Karel


--
--------------------------------------------
embedded brains GmbH
Mr. Christian MAUDERER
Dornierstr. 4
82178 Puchheim
Germany
email:  christian.maude...@embedded-brains.de
phone:  +49-89-18 94 741 - 18
mobile: +49-176-152 206 08

Court of registration: Amtsgericht München
Registration number: HRB 157899
Authorized managing directors: Peter Rasmussen, Thomas Dörfler
You can find our privacy policy here:
https://embedded-brains.de/datenschutzerklaerung/
