
Artificial Intelligence

El Segundo Selected as Host Site for Artificial Intelligence Bootcamp – APPLICATIONS NOW OPEN!


EL SEGUNDO, Calif., April 28, 2023 (GLOBE NEWSWIRE) — Notified will host a Mark Cuban Foundation Artificial Intelligence (AI) Bootcamp for high school students at no cost in the Fall of 2023. The El Segundo AI Bootcamp is targeted at underserved high school students (9th-12th grade) and will introduce them to basic AI concepts and skills.

Notified is one of more than 28 companies selected to host camps across the US in Fall 2023.

The Notified Bootcamp will be held over four consecutive Saturdays, starting on October 14th and ending on November 4th. The bootcamp will run each Saturday from 11 a.m. to 3 p.m. PT, and if accepted, students must commit to attending all four sessions.

The student and parent applications are now open at markcubanai.org/notifiedcapr. Students do not need any prior experience with computer science, programming, or robotics to apply and attend.

Because AI is a frequent topic in the news, students will learn what artificial intelligence is and is not, where they already interact with AI in their own lives, and the ethical implications of AI systems such as TikTok recommendations, smart home assistants, facial recognition, and self-driving cars. Participants will also learn how Large Language Models like ChatGPT are changing life as we know it by answering questions, telling original stories, and even writing computer code.

Students will benefit from volunteer corporate mentor instructors who are knowledgeable about AI, machine learning, and data science and able to help students quickly understand material normally taught at a collegiate level. As part of the four-hour daily curriculum, students will work with open-source tools each day to build their own AI applications related to computer vision, machine learning, natural language processing, and generative AI.

The Mark Cuban Foundation provides the bootcamp's curriculum materials, trains corporate volunteer mentors, and recruits and scores applications for the local students selected to attend camp. In addition, the Mark Cuban Foundation and Notified work together to provide food, transportation, and access to laptops for students at no cost throughout the Bootcamp.

“It was a lot of fun, I learned things I didn’t even know were possible with A.I. and their real-world applications showed me just how much it will change our world.” – Brandon B., 10th Grade, 2022 AI Bootcamp Participant

Founded by Mark Cuban in 2019, the AI Bootcamp initiative has hosted no-cost AI bootcamps for students across several US cities, including Dallas, Chicago, Pittsburgh, Detroit, and Atlantic City. The Mark Cuban Foundation has impacted 900+ students to date and aims to increase that number year over year.

Students interested in applying to the Mark Cuban Foundation AI Bootcamp should do so before Friday, September 8th, 2023 at markcubanai.org/notifiedcapr. To see our 2023 camp locations and to learn more about the Mark Cuban Foundation AI Bootcamps, please visit markcubanai.org/faq.

Contact: Carli Lidiak, Mark Cuban Foundation
Phone: 309-840-0348
Email: [email protected] 

 

GlobeNewswire is one of the world's largest newswire distribution networks, specializing in the delivery of corporate press releases, financial disclosures, and multimedia content to the media, investment community, individual investors, and the general public.

Artificial Intelligence

Supermicro’s Rack Scale Liquid-Cooled Solutions with the Industry’s Latest Accelerators Target AI and HPC Convergence


Complete Data Center Liquid-Cooled Solutions Enable AI Factories to be Constructed at Unprecedented Speeds Using the Latest Dense GPU Servers Equipped with the Highest-Performing CPUs and GPUs
SAN JOSE, Calif. and HAMBURG, Germany, May 13, 2024 /PRNewswire/ — International Supercomputing Conference (ISC) — Supermicro, Inc. (NASDAQ: SMCI), a Total IT Solution Provider for AI, Cloud, Storage, and 5G/Edge, is addressing the most demanding requirements from customers who want to expand their AI and HPC capacities while reducing data center power requirements. Supermicro delivers complete liquid-cooled solutions, including cold plates, coolant distribution units (CDUs), coolant distribution manifolds (CDMs), and entire cooling towers. A significant reduction in a data center's power usage effectiveness (PUE) is quickly realized with liquid-cooled servers and infrastructure, which can reduce overall data center power consumption by up to 40%.
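As a rough illustration of the PUE metric behind claims like this (PUE is total facility power divided by IT equipment power; the power figures below are hypothetical, not Supermicro measurements):

```python
# Hypothetical illustration of PUE (power usage effectiveness).
# PUE = total facility power / IT equipment power; 1.0 is the ideal.
# All kW figures below are assumed for illustration only.

def pue(it_power_kw: float, cooling_and_overhead_kw: float) -> float:
    """Total facility power divided by IT equipment power."""
    return (it_power_kw + cooling_and_overhead_kw) / it_power_kw

it_load = 1000.0                      # kW of servers, storage, networking
air_cooled = pue(it_load, 600.0)      # heavy chiller/CRAC overhead -> 1.60
liquid_cooled = pue(it_load, 150.0)   # cold plates + CDUs need far less -> 1.15

# Facility-level power saved for the same IT load:
savings = 1 - liquid_cooled / air_cooled
print(f"air-cooled PUE:    {air_cooled:.2f}")
print(f"liquid-cooled PUE: {liquid_cooled:.2f}")
print(f"facility power saved: {savings:.0%}")
```

With these assumed numbers the facility saves about 28% of total power; the press release's "up to 40%" figure would correspond to a larger gap between the air-cooled and liquid-cooled overheads.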

“Supermicro continues to work with our AI and HPC customers to bring the latest technology, including total liquid cooling solutions, into their data centers,” said Charles Liang, president and CEO of Supermicro. “Our complete liquid cooling solutions can handle up to 100 kW per rack, which reduces the TCO in data centers and allows for denser AI and HPC computing. Our building block architecture allows us to bring the latest GPUs and accelerators to market, and with our trusted suppliers, we continue to bring new rack-scale solutions to the market that ship to customers with a reduced time to delivery.”
Supermicro application-optimized high-performance servers are designed to accommodate the most performant CPUs and GPUs for simulation, data analytics, and machine learning. The Supermicro 4U 8-GPU liquid-cooled server is in a class by itself, delivering petaflops of AI computing power in a dense form factor with the NVIDIA H100/H200 HGX GPUs. Supermicro will soon ship liquid-cooled Supermicro X14 SuperBlade in 8U and 6U configurations, the rackmount X14 Hyper, and the Supermicro X14 BigTwin. Several HPC-optimized server platforms will support the Intel Xeon 6900 with P-cores in a compact, multi-node form factor.
Learn more about how Supermicro Rack Scale Integration Services enable you to reduce costs and optimize your data center: https://www.supermicro.com/en/solutions/rack-integration 
In addition, Supermicro continues its leadership in shipping the industry's broadest portfolio of liquid-cooled MGX products. Supermicro also confirms its support for delivering the latest accelerators from Intel, with the new Intel® Gaudi® 3 accelerator, and from AMD, with the MI300X accelerator. With up to 120 nodes per rack with the Supermicro SuperBlade®, large-scale HPC applications can be executed in just a few racks. Supermicro will display a wide range of servers at the International Supercomputing Conference, including Supermicro X14 systems incorporating the Intel® Xeon® 6 processors.
Supermicro will also showcase and demonstrate a wide range of solutions designed specifically for HPC and AI environments at ISC 2024. The new 4U 8-GPU liquid-cooled servers with NVIDIA HGX H100 and H200 GPUs highlight the Supermicro lineup. These servers and others will support the NVIDIA B200 HGX GPUs when available. New systems with high-end GPUs accelerate AI training and HPC simulation by bringing more data closer to the GPU than previous generations through high-speed HBM3 memory. With the incredible density of the 4U liquid-cooled servers, a single rack delivers 8 servers × 8 GPUs × 1,979 TFLOPS FP16 (with sparsity) = 126+ petaflops. The Supermicro SYS-421GE-TNHR2-LCC can use dual 4th or 5th Gen Intel Xeon processors, and the AS-4125GS-TNHR2-LCC is available with dual 4th Gen AMD EPYC™ CPUs.
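The rack-level throughput quoted above is simple arithmetic; a quick sanity check (using the quoted per-GPU FP16-with-sparsity rating):

```python
# Verify the quoted rack-level FP16 throughput.
servers_per_rack = 8
gpus_per_server = 8
tflops_fp16_sparse = 1979  # quoted per-GPU FP16 TFLOPS with sparsity

rack_tflops = servers_per_rack * gpus_per_server * tflops_fp16_sparse
rack_petaflops = rack_tflops / 1000  # 1 petaflop = 1,000 teraflops
print(rack_petaflops)  # 126.656 -> the "126+ petaflops" in the text
```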
The new AS-8125GS-TNMR2 server gives users access to 8 AMD Instinct™ MI300X accelerators. This system also includes dual AMD EPYC™ 9004 Series Processors with up to 128 cores/256 threads and up to 6TB memory. Each AMD Instinct MI300X accelerator contains 192GB of HBM3 memory per GPU, all connected with an AMD Universal Base Board (UBB 2.0). Moreover, the new AS-2145GH-TNMR-LCC and AS-4145GH-TNMR APU servers are targeted to accelerate HPC workloads with the MI300A APU. Each APU combines high-performance AMD CPU, GPU, and HBM3 memory for 912 AMD CDNA™ 3 GPU compute units, 96 "Zen 4" cores, and 512GB of unified HBM3 memory in a single system.
At ISC 2024, a Supermicro 8U server with the Intel Gaudi 3 AI Accelerator will be shown. This new system is designed for AI training and inference and can be directly networked with a traditional Ethernet fabric. Twenty-four 200-gigabit (Gb) Ethernet ports are integrated into every Intel Gaudi 3 accelerator, providing flexible and open-standard networking. In addition, 128GB of HBM2e high-speed memory is included. The Intel Gaudi 3 accelerator is designed to scale up and scale out efficiently from a single node to thousands, to meet the expansive requirements of GenAI models. Supermicro's Petascale storage systems, which are critical for large-scale HPC and AI workloads, will also be displayed.
The Supermicro SuperCloud Composer data center management software will also be demonstrated, showing how an entire data center, including the status of all liquid-cooled servers, can be monitored and managed from a single console.
Learn more about Supermicro’s presence at ISC 2024 at: https://app.swapcard.com/event/isc-high-performance-2024/exhibitor/RXhoaWJpdG9yXzE1NjYyODE=?
About Super Micro Computer, Inc.
Supermicro (NASDAQ: SMCI) is a global leader in Application-Optimized Total IT Solutions. Founded and operating in San Jose, California, Supermicro is committed to delivering first to market innovation for Enterprise, Cloud, AI, and 5G Telco/Edge IT Infrastructure. We are a Total IT Solutions manufacturer with server, AI, storage, IoT, switch systems, software, and support services. Supermicro’s motherboard, power, and chassis design expertise further enable our development and production, enabling next generation innovation from cloud to edge for our global customers. Our products are designed and manufactured in-house (in the US, Taiwan, and the Netherlands), leveraging global operations for scale and efficiency and optimized to improve TCO and reduce environmental impact (Green Computing). The award-winning portfolio of Server Building Block Solutions® allows customers to optimize for their exact workload and application by selecting from a broad family of systems built from our flexible and reusable building blocks that support a comprehensive set of form factors, processors, memory, GPUs, storage, networking, power, and cooling solutions (air-conditioned, free air cooling or liquid cooling).
Supermicro, Server Building Block Solutions, and We Keep IT Green are trademarks and/or registered trademarks of Super Micro Computer, Inc.
All other brands, names, and trademarks are the property of their respective owners.
SMCI-F
Photo – https://mma.prnewswire.com/media/2410556/Supermicro_AI_HPC_Systems.jpg
Logo – https://mma.prnewswire.com/media/1443241/Supermicro_Logo.jpg

View original content:https://www.prnewswire.co.uk/news-releases/supermicros-rack-scale-liquid-cooled-solutions-with-the-industrys-latest-accelerators-target-ai-and-hpc-convergence-302143091.html


Artificial Intelligence

The Future of AI in Retail Banking Operations


LONDON, May 13, 2024 /PRNewswire/ — Artificial intelligence (AI) has emerged as a key focus for retail banking operations in recent months. Auriemma Group’s latest series of roundtables tackled the ever-growing space, discussing current usage of AI solutions and how members plan to utilize AI to further enhance their offerings.

“There’s a significant amount of interest in AI across all operational areas,” says Nicole Toussaint, Senior Manager of Industry Roundtables at Auriemma. “Although we’re seeing some hesitation in deploying it, I expect we’ll see some big moves by industry players over the coming months.”
Across all operational areas, firms are hoping to leverage AI to assist front-line employees with navigating their knowledge management systems. This would hopefully improve the accuracy of agent work and help to reduce lengthy training periods. Additionally, firms are considering using the tool enterprise-wide to produce meeting minutes and to assist in communications drafting.
In the Collections and Recoveries space, firms hope to use the tool in their contact strategies. Roundtable members believe that AI can make their contact more effective by creating more bespoke strategies. The technology can help firms decide the optimal time to contact customers, the most effective channel to make contact, and the most engaging messaging and content to use.
Fraud Departments see an opportunity to use AI to improve their current fraud detection models to identify bad actors and fraudulent payments more quickly. Several firms have already partnered with vendors who provide AI-powered fraud mitigation tools.
On the Servicing side, firms are discussing improvements to their chatbot offerings. Currently, many chatbots are FAQ-based, but firms believe that AI can revolutionise the chatbot experience and improve satisfaction scores in the channel.
“We see a clear opportunity to leverage AI to provide dynamic call scripting to front-line agents,” says Toussaint. “This would take some of the pressure off agents when servicing and allow banks to provide a more tailored, well-informed customer experience.”
In the Disputes and Chargebacks space, one firm is already using an AI-integrated optical character recognition (OCR) tool to read customer documents provided as proof in a disputes case. Many firms hope to use AI to gain efficiencies in the disputes process. They believe that the tool can help guide agent decisions as they work cases by pulling in bank, scheme, and regulatory policies.
Similarly, in the Complaints space, members see an opportunity for AI to help complaints agents while investigating. AI can not only analyse the materials provided by the complainant but also bring in insights from previously decisioned cases and Financial Ombudsman (FOS) decisions. In this process, the technology can also guide the agent in classifying the complaint type. Some also noted that AI could potentially assist agents with drafting final response letters.
AI is expected to become an evergreen topic at Auriemma Roundtable meetings, especially as firms identify new use cases and as Auriemma brings in experts from the field to drive further thought leadership.
The next set of roundtable meetings is scheduled for June and July at the Edwardian Hotel in Manchester. If you or any of your colleagues are interested in attending as our guests, please contact us via [email protected].
About Auriemma Group
For 40 years, Auriemma’s mission has been to empower clients with authoritative data and actionable insights. Our team comprises recognised experts in four primary areas: operational effectiveness, consumer research, co-brand partnerships and corporate finance. Our business intelligence and advisory services give clients access to the data, expertise and tools they need to navigate an increasingly complex environment and maximise their performance. Auriemma serves the consumer financial services ecosystem from our offices in London and New York City. For more information, visit us at www.auriemma.group.

View original content:https://www.prnewswire.co.uk/news-releases/the-future-of-ai-in-retail-banking-operations-302141978.html


Artificial Intelligence

MSI Highlights Optimized AI Platforms to Accelerate Compute-Intensive Applications at ISC 2024


HAMBURG, Germany, May 13, 2024 /PRNewswire/ — MSI, a leading global server provider, is showcasing its latest server platforms, powered by AMD processors and 5th Gen Intel® Xeon® Scalable processors and optimized for the HPC and AI markets, at ISC 2024, booth #F39, in Hamburg, Germany, from May 13-15.

“As businesses increasingly adopt AI applications to improve customer experiences, the need for greater computing power and denser deployments has spurred significant shifts in IT infrastructure, driving a widespread adoption of liquid cooling,” said Danny Hsu, General Manager of Enterprise Platform Solutions. “MSI’s AI server platforms empower businesses to achieve efficiency gains while handling more compute-intensive workloads.”
Diverse GPU Platforms to Enhance Performance for AI Workloads
MSI G4201 is a 4U supercomputer designed for exceptional performance in compute-intensive tasks. It features up to eight double-wide PCIe 5.0 x16 slots optimized for high-performance GPU cards, alongside one single-wide PCIe 5.0 x16 expansion slot. Each GPU has a full PCIe 4.0 or 5.0 x16 link directly to the root port complex of a CPU socket without going through a PCIe switch, granting maximum CPU-to-GPU bandwidth. Powered by dual 5th Gen Intel Xeon Scalable Processors and equipped with 32 DDR5 DIMMs, the G4201 platform delivers outstanding heterogeneous computing capabilities for various GPU-based scientific high-performance computing, Generative AI, and inference applications. Additionally, the system includes twelve 3.5-inch drive bays for enhanced functionality.
The G4101 is a 4U 4GPU server platform designed for AI training workloads. It supports a single AMD EPYC™ 9004 Series processor equipped with a liquid cooling module, along with twelve DDR5 RDIMM slots. Additionally, it features four PCIe 5.0 x16 slots tailored for triple-slot graphic cards with coolers, ensuring increased airflow and sustained performance. With twelve front 2.5-inch U.2 NVMe/SATA drive bays, it offers high-speed and flexible storage options, catering to the diverse needs of AI workloads. The G4101 combines air flow spacing and liquid closed-loop cooling, making it the optimal thermal management solution for even the most demanding tasks.
For small-sized businesses, the liquid-cooled S1102-02 server offers an ideal solution, providing superior thermal performance while optimizing costs. Equipped with a single AMD Ryzen™ 7000 Series processor with liquid cooling support of up to 170W, the system features four DDR5 DIMM slots, one PCIe 4.0 slot, two 10GbE onboard Ethernet ports, and four 3.5-inch SATA hot-swappable drive bays in a compact 1U configuration.
Photo – https://mma.prnewswire.com/media/2407458/MSI_s_Diverse_AI_Server_Platforms_Empower_Businesses_for_Enhanced_Efficiency_and_Performance.jpg

View original content:https://www.prnewswire.co.uk/news-releases/msi-highlights-optimized-ai-platforms-to-accelerate-compute-intensive-applications-at-isc-2024-302139397.html
