Nvidia Jetson AGX Xavier Developer Event: AI-Powered Robots Abound

We visited Nvidia's $370 million Endeavor headquarters to attend the company's Jetson Developer Event.

  1. PaulAlcorn said:
    We visited Nvidia's $370 million Endeavor headquarters in Santa Clara, California, to attend the company's Jetson Developer Event. The sprawling facility features an open floor plan that houses 2,500 employees.

    ...and people wonder why their graphics cards cost so much.
  2. BTW, Magic Leap is using a TX2 with reduced CPU cores.
  3. Note that the AIs are application-specific at this time too... and still somewhat limited, even the more advanced ones where human interaction takes place. We're still a ways away from having our own "Rosie" or other robot servants that can handle a variety of different, unrelated tasks.
  4. The 1st Jetson was the TK1, not the TX1.
  5. Most edge computing only runs a pre-trained model. Xavier isn't so much limited to specific applications as the models need to be trained for specific tasks on non-Xavier systems before being deployed to Xavier. Right now Xavier is quite capable of running most any neural net on camera or sensor data without effort... provided it is using a pre-trained network. "Rosie" is more about the pre-trained models than about the edge computing system itself.
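
    Just to illustrate what "running a pre-trained network on camera data" means in practice, here's a minimal sketch (not from the article); it assumes PyTorch, torchvision, and OpenCV are installed on the board, and the ResNet-18 model and camera index 0 are only placeholders:

    ```python
    # Minimal sketch: run inference with a pre-trained classifier on one camera
    # frame. All training happened elsewhere; the edge device only does inference.
    import cv2
    import torch
    from torchvision import models, transforms

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = models.resnet18(pretrained=True).eval().to(device)  # pre-trained weights

    preprocess = transforms.Compose([
        transforms.ToPILImage(),
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    cap = cv2.VideoCapture(0)  # camera index is a placeholder
    ok, frame = cap.read()
    if ok:
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        batch = preprocess(rgb).unsqueeze(0).to(device)
        with torch.no_grad():
            class_id = model(batch).argmax(dim=1).item()
        print("Predicted ImageNet class id:", class_id)
    cap.release()
    ```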
  6. shrapnel_indie said:
    Note that the AIs are application-specific at this time too... and still somewhat limited, even the more advanced ones where human interaction takes place. We're still a ways away from having our own "Rosie" or other robot servants that can handle a variety of different, unrelated tasks.

    With 5G, you could have an Alexa-type cloud backend, and the robot could download pre-trained models to perform various tasks on an as-needed basis.

    The data it collects while performing the task could be fed back to the cloud, for further training and refinement of the model.

    Where 5G comes into the picture is providing the connectivity and bandwidth needed to download these big models and upload the additional training data. The robot could cache frequently-used models, so it doesn't have to pause before starting each task.
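
    Purely as an illustration of that caching idea (the backend URL, file names, and cache path below are made up, not a real service), a sketch could look like:

    ```python
    # Hypothetical sketch of an on-device model cache: fetch a pre-trained model
    # from a cloud backend only when it isn't already cached locally.
    import urllib.request
    from pathlib import Path

    MODEL_SERVER = "https://models.example.com"      # placeholder backend
    CACHE_DIR = Path.home() / ".robot_model_cache"   # placeholder cache location

    def get_model_path(task_name: str) -> Path:
        """Return a local path to the task's model, downloading it on a cache miss."""
        CACHE_DIR.mkdir(parents=True, exist_ok=True)
        local_path = CACHE_DIR / f"{task_name}.onnx"
        if not local_path.exists():  # cache miss: pull the model over the network
            urllib.request.urlretrieve(f"{MODEL_SERVER}/{task_name}.onnx", local_path)
        return local_path

    # e.g. model_file = get_model_path("table_clearing"), then load it for inference
    ```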
  7. I keep hearing about these things, but I've never seen an article explaining what Jetson is about. One of my friends said he uses the previous generation of this thing to operate security cameras?
  8. jn77 said:
    I keep hearing about these things, but I've never seen an article explaining what Jetson is about. One of my friends said he uses the previous generation of this thing to operate security cameras?

    It's their embedded platform. They provide both compute modules and developer kits that are essentially PC-like. Some people just use them as small, ARM-based Linux workstations. The TX2's GPU is comparable to a GT 1030.

    https://www.nvidia.com/en-us/autonomous-machines/embedded-systems-dev-kits-modules/#jetsonDevkits

    To give you an idea of the horsepower, the TX1 is essentially what's used in the Nintendo Switch. The TX2 is up to 50% faster, but Xavier is a rather different, AI-focused animal. Don't expect to see a Xavier-based game console. All of their superlatives refer to its AI performance (provided by the addition of Tensor cores). Because of this, it's much bigger and much more expensive.
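
    If you want to see what class of GPU a given Jetson (or any CUDA box) exposes, a quick check from Python works; this is just a sketch and assumes PyTorch is installed:

    ```python
    # Print the CUDA device name, compute capability, and memory -- an easy way
    # to see what you're working with before deploying a model.
    import torch

    if torch.cuda.is_available():
        props = torch.cuda.get_device_properties(0)
        print(f"GPU: {props.name}")
        print(f"Compute capability: {props.major}.{props.minor}")
        print(f"Memory: {props.total_memory / 1024**3:.1f} GiB")
    else:
        print("No CUDA device visible")
    ```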
  9. jn77 said:
    I keep hearing about these things, but I've never seen an article explaining what Jetson is about. One of my friends said he uses the previous generation of this thing to operate security cameras?


    NVIDIA has produced its "Tegra" series of System on a Chip (SoC) designs for a very long time. These have embedded ARM CPUs and most everything needed to run an entire operating system, short of the power delivery and connectors. The Jetson series are the Tegra boards which first had a GPU added as well. Previously there was no embedded system with a GPU, and thus no embedded system could have done any kind of real-time neural network solving.

    The TK1 is their first version; the Jetson TK1 is the TK1 with a carrier board, sold together as a developer system. It has a Kepler-series GPU.

    The TX1 is the first 64-bit Tegra, and the Jetson TX1 is a module with a developer carrier board (think of the carrier as similar to a motherboard, though carriers do not implement everything a PC motherboard would... much of that is within the SoC itself). This one has a Maxwell-architecture GPU.

    The TX2 is the second 64-bit version and has the exact same module layout as the TX1, but it is much faster and uses a Pascal-series GPU. The carrier board for their "development" kit is the same as the TX1's, and either the module or the module plus carrier can be called one of the Jetson series.

    Xavier is their most recent version. The layout has changed, and although the developer kit is much smaller than the TX1/TX2 kits, it is scorching fast in comparison (and the TX2 was already rather fast). It uses the Volta architecture. The developer kit with its carrier is tiny; most of the weight is in the heat sink, but there isn't much this can't do.
    https://elinux.org/Jetson_AGX_Xavier

    Note that the 32-bit TK1 is what is known as armhf (ARMv7-a), and that 64-bit is arm64/aarch64 (ARMv8-a).
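
    If you ever need to check which of those you're running on, the standard library will tell you (just a sketch; the strings shown are the usual ones on Linux):

    ```python
    # Report the CPU architecture: a 32-bit TK1 typically reports "armv7l" (armhf),
    # while the 64-bit TX1/TX2/Xavier report "aarch64" (arm64).
    import platform

    print(platform.machine())
    ```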
  10. LinuxDevice said:
    NVIDIA has produced its "Tegra" series of System on a Chip (SoC) designs for a very long time. These have embedded ARM CPUs and most everything needed to run an entire operating system, short of the power delivery and connectors. The Jetson series are the Tegra boards which first had a GPU added as well. Previously there was no embedded system with a GPU, and thus no embedded system could have done any kind of real-time neural network solving.

    Tegra SoCs have always had a GPU and ARM core(s).

    https://en.wikipedia.org/wiki/Tegra

    Jetson is the embedded platform they created to make it easier for people to use their Tegra chips.

    I think ODROID actually had embedded Linux boards with GPUs for several years before Jetson (and technically, the Raspberry Pi has a GPU, though it's not very strong or programmable). But, as you point out, the Jetson modules are a popular option for performance-intensive embedded applications.
  11. bit_user said:
    Tegra SoCs have always had a GPU and ARM core(s).

    Sorry, I should have said it is the first with CUDA, allowing the GPU to be used for neural networks.