
Artificial Intelligence

CDPQ invests in QIMA, a leading provider of supply chain compliance solutions


Caisse de dépôt et placement du Québec (CDPQ), a global investment group, today announced the acquisition of a significant minority interest in QIMA, a leading provider of supply chain compliance solutions, alongside QIMA’s founder and management. This investment will allow the company to continue driving its strategic growth plan focused on both acquisitions and the expansion of its service offerings into new geographies and sectors. The transaction is subject to customary regulatory approvals.

Founded in 2005, QIMA is a fast-growing global Testing, Inspection and Certification (TIC) company and a digital pioneer in the sector that has invested heavily in developing an industry-leading technology platform. The company is active in the food, consumer goods and life sciences markets, serving more than 15,000 clients in over 120 countries. QIMA has a broad global presence with more than 4,000 employees in 88 countries, as well as offices and specialized laboratories located around the world. The company has further differentiated its value proposition through its own supply chain quality management SaaS platform, QIMAone, which facilitates transparency and collaboration by giving customers real-time visibility into their entire procurement ecosystem and a shared view of quality and compliance performance.

“We are thrilled to welcome CDPQ and begin a new chapter in the development of QIMA,” said Sebastien Breteau, founder and CEO at QIMA. “As consumers’ expectations surrounding quality, safety, and environmental impact continue to rise, widespread disruptions are simultaneously making global supply chain management more complex than ever. With the support of CDPQ, QIMA is in a unique position to help. By combining an industry-leading technology platform with our global experts’ presence on the ground, we’ll continue to bring more transparency and traceability to the products consumers are using every day.”

“QIMA has enjoyed significant growth thanks to its superior level of digitalization and ability to successfully integrate numerous acquisitions over the last few years while also continuing to serve a growing base of blue-chip clients,” said Martin Laguerre, Executive Vice-President and Head of Private Equity and Capital Solutions at CDPQ. “With this investment, we look forward to partnering with QIMA and its strong management team as they continue to support a broad range of clients across the world with fast, accurate and transparent data that is essential to ensuring quality products that improve consumer safety and confidence.”

“We are delighted to partner with QIMA and to support the company in its next phase of growth,” said Albrecht von Alvensleben, Managing Director and Head of Private Equity Europe at CDPQ London. “QIMA’s entrepreneurial, customer-centric culture, combined with its proven ability to continuously reinvent itself, are perfectly aligned with CDPQ’s ambition to create sustainable, long-term value.”

Linklaters LLP served as legal counsel, BCG as commercial advisor, and PwC as finance and tax advisor to CDPQ. SVZ served as legal counsel, EY-Parthenon as commercial and IT advisor, Accuracy as finance advisor, and DLA Piper as tax and legal advisor to QIMA. Oloryn Partners served as advisor to QIMA’s management.

Artificial Intelligence

Supermicro’s Rack Scale Liquid-Cooled Solutions with the Industry’s Latest Accelerators Target AI and HPC Convergence


Complete Data Center Liquid-Cooled Solutions Enable AI Factories to be Constructed at Unprecedented Speeds Using the Latest Dense GPU Servers Equipped with the Highest-Performing CPUs and GPUs
SAN JOSE, Calif. and HAMBURG, Germany, May 13, 2024 /PRNewswire/ — International Supercomputing Conference (ISC) — Supermicro, Inc. (NASDAQ: SMCI), a Total IT Solution Provider for AI, Cloud, Storage, and 5G/Edge, is addressing the most demanding requirements from customers who want to expand their AI and HPC capacities while reducing data center power consumption. Supermicro delivers complete liquid-cooled solutions, including cold plates, coolant distribution units (CDUs), coolant distribution manifolds (CDMs), and entire cooling towers. Data center liquid-cooled servers and infrastructure quickly deliver a significant reduction in a data center's PUE and can cut overall power consumption in the data center by up to 40%.
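For context, PUE (Power Usage Effectiveness) is the ratio of total facility power to IT equipment power. A back-of-envelope sketch (the PUE values below are illustrative assumptions, not Supermicro figures) shows how lowering PUE via liquid cooling translates into facility-level savings:

```python
# Back-of-envelope PUE comparison (illustrative values, not vendor figures).
# PUE = total facility power / IT equipment power, so for a fixed IT load
# the facility power draw scales linearly with PUE.

def facility_power_kw(it_load_kw: float, pue: float) -> float:
    """Total facility power for a given IT load and PUE."""
    return it_load_kw * pue

it_load_kw = 1000.0   # fixed IT load (e.g., 1 MW of servers)
pue_air = 1.6         # assumed typical air-cooled data center
pue_liquid = 1.1      # assumed liquid-cooled data center

air = facility_power_kw(it_load_kw, pue_air)        # 1600 kW
liquid = facility_power_kw(it_load_kw, pue_liquid)  # 1100 kW
savings_pct = 100 * (air - liquid) / air            # 31.25%

print(f"air-cooled: {air:.0f} kW, liquid-cooled: {liquid:.0f} kW, "
      f"savings: {savings_pct:.1f}%")
```

Actual savings depend on climate, coolant loop design, and server fan power; figures approaching the quoted 40% imply additional reductions beyond the facility-cooling term alone.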

“Supermicro continues to work with our AI and HPC customers to bring the latest technology, including total liquid cooling solutions, into their data centers,” said Charles Liang, president and CEO of Supermicro. “Our complete liquid cooling solutions can handle up to 100 kW per rack, which reduces the TCO in data centers and allows for denser AI and HPC computing. Our building block architecture allows us to bring the latest GPUs and accelerators to market, and with our trusted suppliers, we continue to bring new rack-scale solutions to the market that ship to customers with a reduced time to delivery.”
Supermicro application-optimized high-performance servers are designed to accommodate the highest-performing CPUs and GPUs for simulation, data analytics, and machine learning. The Supermicro 4U 8-GPU liquid-cooled server is in a class by itself, delivering petaflops of AI computing power in a dense form factor with the NVIDIA H100/H200 HGX GPUs. Supermicro will soon ship the liquid-cooled Supermicro X14 SuperBlade in 8U and 6U configurations, the rackmount X14 Hyper, and the Supermicro X14 BigTwin. Several HPC-optimized server platforms will support the Intel Xeon 6900 with P-cores in a compact, multi-node form factor.
Learn more about how Supermicro Rack Scale Integration Services enable you to reduce costs and optimize your data center: https://www.supermicro.com/en/solutions/rack-integration 
In addition, Supermicro continues its leadership in shipping the industry's broadest portfolio of liquid-cooled NVIDIA MGX products. Supermicro also confirms its support for the latest accelerators, including the new Intel® Gaudi® 3 accelerator and AMD's Instinct MI300X accelerators. With up to 120 nodes per rack with the Supermicro SuperBlade®, large-scale HPC applications can be executed in just a few racks. Supermicro will display a wide range of servers at the International Supercomputing Conference, including Supermicro X14 systems incorporating the Intel® Xeon® 6 processors.
Supermicro will also showcase and demonstrate a wide range of solutions designed specifically for HPC and AI environments at ISC 2024. The new 4U 8-GPU liquid-cooled servers with NVIDIA HGX H100 and H200 GPUs highlight the Supermicro lineup. These servers and others will support the NVIDIA B200 HGX GPUs when available. New systems with high-end GPUs accelerate AI training and HPC simulation by using high-speed HBM3 memory to bring more data closer to the GPU than previous generations. With the incredible density of the 4U liquid-cooled servers, a single rack delivers 8 servers × 8 GPUs × 1,979 TFLOPS FP16 (with sparsity) = 126+ petaflops. The Supermicro SYS-421GE-TNHR2-LCC can use dual 4th or 5th Gen Intel Xeon processors, and the AS -4125GS-TNHR2-LCC is available with dual 4th Gen AMD EPYC™ CPUs.
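The rack-level figure follows directly from the per-GPU throughput; a quick check of the arithmetic:

```python
# Verify the quoted rack-level FP16 throughput for the 4U 8-GPU configuration.
servers_per_rack = 8
gpus_per_server = 8
tflops_per_gpu_fp16 = 1979  # per-GPU FP16 TFLOPS with sparsity, as quoted above

rack_tflops = servers_per_rack * gpus_per_server * tflops_per_gpu_fp16
rack_petaflops = rack_tflops / 1000  # 1 petaflop = 1000 teraflops

print(f"{rack_tflops} TFLOPS ≈ {rack_petaflops:.1f} petaflops")
```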
The new AS -8125GS-TNMR2 server gives users access to eight AMD Instinct™ MI300X accelerators. This system also includes dual AMD EPYC™ 9004 Series processors with up to 128 cores/256 threads and up to 6TB of memory. Each AMD Instinct MI300X accelerator contains 192GB of HBM3 memory, and all eight are connected with an AMD Universal Base Board (UBB 2.0). Moreover, the new AS -2145GH-TNMR-LCC and AS -4145GH-TNMR APU servers are targeted at accelerating HPC workloads with the MI300A APU. Each APU combines high-performance AMD CPU cores, GPU compute units, and HBM3 memory; a four-APU system totals 912 AMD CDNA™ 3 GPU compute units, 96 "Zen 4" cores, and 512GB of unified HBM3 memory.
At ISC 2024, a Supermicro 8U server with the Intel Gaudi 3 AI accelerator will be shown. This new system is designed for AI training and inference and can be networked directly over a traditional Ethernet fabric. Twenty-four 200-gigabit (Gb) Ethernet ports are integrated into every Intel Gaudi 3 accelerator, providing flexible, open-standard networking, and 128GB of HBM2e high-speed memory is included. The Intel Gaudi 3 accelerator is designed to scale up and scale out efficiently from a single node to thousands of nodes to meet the expansive requirements of GenAI models. Supermicro's Petascale storage systems, which are critical for large-scale HPC and AI workloads, will also be displayed.
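Based only on the port counts stated above, the aggregate Ethernet bandwidth per accelerator works out as follows:

```python
# Aggregate Ethernet bandwidth per Intel Gaudi 3 accelerator,
# derived from the 24 x 200 Gb port count quoted above.
ports_per_accelerator = 24
gbps_per_port = 200

total_gbps = ports_per_accelerator * gbps_per_port  # 4800 Gb/s
total_tbps = total_gbps / 1000                      # 4.8 Tb/s

print(f"{total_gbps} Gb/s = {total_tbps} Tb/s per accelerator")
```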
Supermicro's SuperCloud Composer data center management software will also be demonstrated, showing how an entire data center, including the status of all liquid-cooled servers, can be monitored and managed from a single console.
Learn more about Supermicro’s presence at ISC 2024 at: https://app.swapcard.com/event/isc-high-performance-2024/exhibitor/RXhoaWJpdG9yXzE1NjYyODE=?
About Super Micro Computer, Inc.
Supermicro (NASDAQ: SMCI) is a global leader in Application-Optimized Total IT Solutions. Founded and operating in San Jose, California, Supermicro is committed to delivering first-to-market innovation for Enterprise, Cloud, AI, and 5G Telco/Edge IT Infrastructure. We are a Total IT Solutions manufacturer with server, AI, storage, IoT, switch systems, software, and support services. Supermicro's motherboard, power, and chassis design expertise further enables our development and production, supporting next-generation innovation from cloud to edge for our global customers. Our products are designed and manufactured in-house (in the US, Taiwan, and the Netherlands), leveraging global operations for scale and efficiency and optimized to improve TCO and reduce environmental impact (Green Computing). The award-winning portfolio of Server Building Block Solutions® allows customers to optimize for their exact workload and application by selecting from a broad family of systems built from our flexible and reusable building blocks that support a comprehensive set of form factors, processors, memory, GPUs, storage, networking, power, and cooling solutions (air-conditioned, free-air cooling, or liquid cooling).
Supermicro, Server Building Block Solutions, and We Keep IT Green are trademarks and/or registered trademarks of Super Micro Computer, Inc.
All other brands, names, and trademarks are the property of their respective owners.
SMCI-F
Photo – https://mma.prnewswire.com/media/2410556/Supermicro_AI_HPC_Systems.jpg
Logo – https://mma.prnewswire.com/media/1443241/Supermicro_Logo.jpg

View original content:https://www.prnewswire.co.uk/news-releases/supermicros-rack-scale-liquid-cooled-solutions-with-the-industrys-latest-accelerators-target-ai-and-hpc-convergence-302143091.html


Artificial Intelligence

The Future of AI in Retail Banking Operations


LONDON, May 13, 2024 /PRNewswire/ — Artificial intelligence (AI) has emerged as a key focus for retail banking operations in recent months. Auriemma Group’s latest series of roundtables tackled the ever-growing space, discussing current usage of AI solutions and how members plan to utilize AI to further enhance their offerings.

“There’s a significant amount of interest in AI across all operational areas,” says Nicole Toussaint, Senior Manager of Industry Roundtables at Auriemma. “Although we’re seeing some hesitation in deploying it, I expect we’ll see some big moves by industry players over the coming months.”
Across all operational areas, firms hope to leverage AI to help front-line employees navigate their knowledge management systems, with the goal of improving the accuracy of agent work and reducing lengthy training periods. Additionally, firms are considering using AI enterprise-wide to produce meeting minutes and to assist in drafting communications.
In the Collections and Recoveries space, firms hope to use the tool in their contact strategies. Roundtable members believe that AI can make their contact more effective by creating more bespoke strategies. The technology can help firms decide the optimal time to contact customers, the most effective channel to make contact, and the most engaging messaging and content to use.
Fraud Departments see an opportunity to use AI to improve their current fraud detection models to identify bad actors and fraudulent payments more quickly. Several firms have already partnered with vendors who provide AI-powered fraud mitigation tools.
On the Servicing side, firms are discussing improvements to their chatbot offerings. Currently, many chatbots are FAQ-based, but firms believe that AI can revolutionise the chatbot experience and improve satisfaction scores in the channel.
“We see a clear opportunity to leverage AI to provide dynamic call scripting to front-line agents,” says Toussaint. “This would take some of the pressure off agents when servicing and allow banks to provide a more tailored, well-informed customer experience.”
In the Disputes and Chargebacks space, one firm is already using an AI-integrated optical character recognition (OCR) tool to read customer documents provided as proof in a disputes case. Many firms hope to use AI to gain efficiencies in the disputes process. They believe that the tool can help guide agent decisions as they work cases by pulling in bank, scheme, and regulatory policies.
Similarly, in the Complaints space, members see an opportunity for AI to help complaints agents while investigating. AI can not only analyse the materials provided by the complainant but also bring in insights from previously decisioned cases and Financial Ombudsman (FOS) decisions. In this process, the technology can also guide the agent in classifying the complaint type. Some also noted that AI could potentially assist agents with drafting final response letters.
AI is expected to become an evergreen topic at Auriemma Roundtable meetings, especially as firms identify new use cases and as Auriemma brings in experts from the field to drive further thought leadership.
The next set of roundtable meetings are scheduled for June and July at the Edwardian Hotel in Manchester. If you or any of your colleagues are interested in attending as our guests, please contact us via [email protected].
About Auriemma Group
For 40 years, Auriemma’s mission has been to empower clients with authoritative data and actionable insights. Our team comprises recognised experts in four primary areas: operational effectiveness, consumer research, co-brand partnerships and corporate finance. Our business intelligence and advisory services give clients access to the data, expertise and tools they need to navigate an increasingly complex environment and maximise their performance. Auriemma serves the consumer financial services ecosystem from our offices in London and New York City. For more information, visit us at www.auriemma.group.

View original content:https://www.prnewswire.co.uk/news-releases/the-future-of-ai-in-retail-banking-operations-302141978.html


Artificial Intelligence

MSI Highlights Optimized AI Platforms to Accelerate Compute-Intensive Applications at ISC 2024


HAMBURG, Germany, May 13, 2024 /PRNewswire/ — MSI, a leading global server provider, is showcasing its latest server platforms, powered by AMD processors and 5th Gen Intel® Xeon® Scalable processors and optimized for the HPC and AI markets, at ISC 2024, booth #F39, in Hamburg, Germany from May 13-15.

“As businesses increasingly adopt AI applications to improve customer experiences, the need for greater computing power and denser deployments has spurred significant shifts in IT infrastructure, driving a widespread adoption of liquid cooling,” said Danny Hsu, General Manager of Enterprise Platform Solutions. “MSI’s AI server platforms empower businesses to achieve efficiency gains while handling more compute-intensive workloads.”
Diverse GPU Platforms to Enhance Performance for AI Workloads
MSI G4201 is a 4U supercomputer designed for exceptional performance in compute-intensive tasks. It features up to eight double-wide PCIe 5.0 x16 slots optimized for high-performance GPU cards, alongside one single-wide PCIe 5.0 x16 expansion slot. Each GPU has a full PCIe 4.0 or 5.0 x16 link directly to the root port complex of a CPU socket without going through a PCIe switch, granting maximum CPU-to-GPU bandwidth. Powered by dual 5th Gen Intel Xeon Scalable Processors and equipped with 32 DDR5 DIMMs, the G4201 platform delivers outstanding heterogeneous computing capabilities for various GPU-based scientific high-performance computing, Generative AI, and inference applications. Additionally, the system includes twelve 3.5-inch drive bays for enhanced functionality.
The G4101 is a 4U 4GPU server platform designed for AI training workloads. It supports a single AMD EPYC™ 9004 Series processor equipped with a liquid cooling module, along with twelve DDR5 RDIMM slots. Additionally, it features four PCIe 5.0 x16 slots tailored for triple-slot graphic cards with coolers, ensuring increased airflow and sustained performance. With twelve front 2.5-inch U.2 NVMe/SATA drive bays, it offers high-speed and flexible storage options, catering to the diverse needs of AI workloads. The G4101 combines air flow spacing and liquid closed-loop cooling, making it the optimal thermal management solution for even the most demanding tasks.
For small-sized businesses, the liquid-cooled S1102-02 server offers an ideal solution, providing superior thermal performance while optimizing costs. Equipped with a single AMD Ryzen™ 7000 Series processor with liquid cooling support of up to 170W, the system features four DDR5 DIMM slots, one PCIe 4.0 slot, two 10GbE onboard Ethernet ports, and four 3.5-inch SATA hot-swappable drive bays in a compact 1U configuration.
Photo – https://mma.prnewswire.com/media/2407458/MSI_s_Diverse_AI_Server_Platforms_Empower_Businesses_for_Enhanced_Efficiency_and_Performance.jpg

View original content:https://www.prnewswire.co.uk/news-releases/msi-highlights-optimized-ai-platforms-to-accelerate-compute-intensive-applications-at-isc-2024-302139397.html
