
Artificial Intelligence

Cairn India selects Platts Dated Brent for Oil Pricing

S&P Global Platts, the leading independent provider of information and benchmark prices for the commodities and energy markets, today announced that Cairn India Ltd., a unit of Vedanta Ltd., has adopted the Platts Dated Brent crude oil benchmark to price the majority of its flagship Rajasthan (“RJ”) crude sales. This marks the first time a prominent grade of indigenously produced crude in India has been priced off the global oil benchmark.

Cairn India, the largest private crude oil producer by volume in India, began using Platts Dated Brent for sales of Rajasthan crude in contracts beginning April 2020. Platts Dated Brent is the globally recognized crude benchmark, which estimates suggest is used to price more than 60% of the world’s oil.

With this step, India joins a host of largely sweet crude oil producers in Asia Pacific that have transitioned to using Platts Dated Brent in pricing their crude oil sales over the past decade.

Vera Blei, Head of Oil Markets, S&P Global Platts said: “The selection of Platts Dated Brent as the price benchmark to price one of India’s flagship crude oil grades reflects the confidence Cairn India and buyers of RJ crude have in the quality of our independent price reporting. The integrity of the methodology underpinning our assessment processes allows Platts price benchmarks to be relied upon by the world’s most important energy markets, which explains Cairn’s decision to select Platts Dated Brent for use in its term oil contracts.”

Produced in India’s western state of the same name, Rajasthan is a medium-heavy, waxy sweet crude with an API gravity of 29.5 degrees and a sulfur content of 0.086% by weight. The Rajasthan blocks, the first of which was discovered in 2004, are considered to be the largest onshore hydrocarbon find in India in decades. Most of Rajasthan crude flows to refineries located on the west coast of India.


Supermicro’s Rack Scale Liquid-Cooled Solutions with the Industry’s Latest Accelerators Target AI and HPC Convergence



Complete Data Center Liquid-Cooled Solutions Enable AI Factories to be Constructed at Unprecedented Speeds Using the Latest Dense GPU Servers Equipped with the Highest-Performing CPUs and GPUs
SAN JOSE, Calif. and HAMBURG, Germany, May 13, 2024 /PRNewswire/ — International Supercomputing Conference (ISC) — Supermicro, Inc. (NASDAQ: SMCI), a Total IT Solution Provider for AI, Cloud, Storage, and 5G/Edge, is addressing the most demanding requirements from customers who want to expand their AI and HPC capacities while reducing data center power requirements. Supermicro delivers complete liquid-cooled solutions, including cold plates, coolant distribution units (CDUs), coolant distribution manifolds (CDMs), and entire cooling towers. With data center liquid-cooled servers and infrastructure, a significant reduction in a data center’s PUE is quickly realized, which can reduce overall power consumption in the data center by up to 40%.
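For context, PUE (Power Usage Effectiveness) is total facility power divided by IT equipment power. The sketch below shows how a lower PUE translates into facility-level savings; the two PUE values used are illustrative assumptions, not figures from any Supermicro deployment.

```python
# PUE = total facility power / IT equipment power.
# The PUE values below are illustrative assumptions, not measured data.

def facility_power_kw(it_load_kw: float, pue: float) -> float:
    """Total facility power implied by an IT load and a PUE."""
    return it_load_kw * pue

it_load = 1000.0  # 1 MW of IT equipment
air_cooled = facility_power_kw(it_load, pue=1.6)     # assumed air-cooled PUE
liquid_cooled = facility_power_kw(it_load, pue=1.1)  # assumed liquid-cooled PUE

savings_pct = 100 * (air_cooled - liquid_cooled) / air_cooled
print(f"Air-cooled:    {air_cooled:.0f} kW")
print(f"Liquid-cooled: {liquid_cooled:.0f} kW")
print(f"Facility power reduction: {savings_pct:.2f}%")
```

With these assumed figures, cutting PUE from 1.6 to 1.1 reduces total facility power by roughly 31%, which is the kind of mechanism behind the "up to 40%" claim above.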

“Supermicro continues to work with our AI and HPC customers to bring the latest technology, including total liquid cooling solutions, into their data centers,” said Charles Liang, president and CEO of Supermicro. “Our complete liquid cooling solutions can handle up to 100 kW per rack, which reduces the TCO in data centers and allows for denser AI and HPC computing. Our building block architecture allows us to bring the latest GPUs and accelerators to market, and with our trusted suppliers, we continue to bring new rack-scale solutions to the market that ship to customers with a reduced time to delivery.”
Supermicro application-optimized high-performance servers are designed to accommodate the most performant CPUs and GPUs for simulation, data analytics, and machine learning. The Supermicro 4U 8-GPU liquid-cooled server is in a class by itself, delivering petaflops of AI computing power in a dense form factor with the NVIDIA H100/H200 HGX GPUs. Supermicro will soon ship liquid-cooled Supermicro X14 SuperBlade in 8U and 6U configurations, the rackmount X14 Hyper, and the Supermicro X14 BigTwin. Several HPC-optimized server platforms will support the Intel Xeon 6900 with P-cores in a compact, multi-node form factor.
Learn more about how Supermicro Rack Scale Integration Services enable you to reduce costs and optimize your data center: https://www.supermicro.com/en/solutions/rack-integration 
In addition, Supermicro continues its leadership in shipping the broadest portfolio of liquid-cooled MGX products in the industry. Supermicro also confirms its support for the latest accelerators, including the new Intel® Gaudi® 3 accelerator and AMD’s MI300X accelerators. With up to 120 nodes per rack with the Supermicro SuperBlade®, large-scale HPC applications can be executed in just a few racks. Supermicro will display a wide range of servers at the International Supercomputing Conference, including Supermicro X14 systems incorporating the Intel® Xeon® 6 processors.
Supermicro will also showcase and demonstrate a wide range of solutions designed specifically for HPC and AI environments at ISC 2024. The new 4U 8-GPU liquid-cooled servers with NVIDIA HGX H100 and H200 GPUs highlight the Supermicro lineup. These servers and others will support the NVIDIA B200 HGX GPUs when available. New systems with high-end GPUs accelerate AI training and HPC simulation by using high-speed HBM3 memory to bring more data closer to the GPU than previous generations. With the incredible density of the 4U liquid-cooled servers, a single rack delivers 8 servers x 8 GPUs x 1,979 TFLOPS FP16 (with sparsity) = 126+ petaflops. The Supermicro SYS-421GE-TNHR2-LCC can use dual 4th or 5th Gen Intel Xeon processors, and the AS -4125GS-TNHR2-LCC is available with dual 4th Gen AMD EPYC™ CPUs.
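The rack-level throughput quoted above follows directly from the per-GPU figure:

```python
# Rack-level FP16 throughput from the figures quoted above:
# 8 servers per rack, 8 GPUs per server, 1,979 TFLOPS FP16 (with sparsity) per GPU.
servers_per_rack = 8
gpus_per_server = 8
tflops_fp16_per_gpu = 1979  # with sparsity

rack_tflops = servers_per_rack * gpus_per_server * tflops_fp16_per_gpu
rack_petaflops = rack_tflops / 1000
print(f"{rack_tflops} TFLOPS = {rack_petaflops:.1f} petaflops per rack")
# 126,656 TFLOPS ≈ 126.7 petaflops, matching the "126+ petaflops" figure.
```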
The new AS -8125GS-TNMR2 server gives users access to eight AMD Instinct™ MI300X accelerators. This system also includes dual AMD EPYC™ 9004 Series Processors with up to 128 cores/256 threads and up to 6TB of memory. Each AMD Instinct MI300X accelerator contains 192GB of HBM3 memory per GPU, all connected with an AMD Universal Base Board (UBB 2.0). Moreover, the new AS -2145GH-TNMR-LCC and AS -4145GH-TNMR APU servers are targeted to accelerate HPC workloads with the MI300A APU. Each APU combines high-performance AMD CPU, GPU, and HBM3 memory; the four APUs in each system together provide 912 AMD CDNA™ 3 GPU compute units, 96 “Zen 4” cores, and 512GB of unified HBM3 memory.
At ISC 2024, a Supermicro 8U server with the Intel Gaudi 3 AI Accelerator will be shown. This new system is designed for AI training and inference and can be networked directly over a traditional Ethernet fabric. Twenty-four 200 gigabit (Gb) Ethernet ports are integrated into every Intel Gaudi 3 accelerator, providing flexible, open-standard networking. In addition, 128GB of HBM2e high-speed memory is included. The Intel Gaudi 3 accelerator is designed to scale up and scale out efficiently from a single node to thousands of nodes to meet the expansive requirements of GenAI models. Supermicro’s Petascale storage systems, which are critical for large-scale HPC and AI workloads, will also be displayed.
Supermicro SuperCloud Composer data center management software will also be demonstrated, showing how an entire data center, including the status of all liquid-cooled servers, can be monitored and managed from a single console.
Learn more about Supermicro’s presence at ISC 2024 at: https://app.swapcard.com/event/isc-high-performance-2024/exhibitor/RXhoaWJpdG9yXzE1NjYyODE=?
About Super Micro Computer, Inc.
Supermicro (NASDAQ: SMCI) is a global leader in Application-Optimized Total IT Solutions. Founded and operating in San Jose, California, Supermicro is committed to delivering first-to-market innovation for Enterprise, Cloud, AI, and 5G Telco/Edge IT Infrastructure. We are a Total IT Solutions manufacturer with server, AI, storage, IoT, switch systems, software, and support services. Supermicro’s motherboard, power, and chassis design expertise further enables our development and production, driving next-generation innovation from cloud to edge for our global customers. Our products are designed and manufactured in-house (in the US, Taiwan, and the Netherlands), leveraging global operations for scale and efficiency and optimized to improve TCO and reduce environmental impact (Green Computing). The award-winning portfolio of Server Building Block Solutions® allows customers to optimize for their exact workload and application by selecting from a broad family of systems built from our flexible and reusable building blocks that support a comprehensive set of form factors, processors, memory, GPUs, storage, networking, power, and cooling solutions (air-conditioned, free air cooling, or liquid cooling).
Supermicro, Server Building Block Solutions, and We Keep IT Green are trademarks and/or registered trademarks of Super Micro Computer, Inc.
All other brands, names, and trademarks are the property of their respective owners.
SMCI-F
Photo – https://mma.prnewswire.com/media/2410556/Supermicro_AI_HPC_Systems.jpg
Logo – https://mma.prnewswire.com/media/1443241/Supermicro_Logo.jpg

View original content: https://www.prnewswire.co.uk/news-releases/supermicros-rack-scale-liquid-cooled-solutions-with-the-industrys-latest-accelerators-target-ai-and-hpc-convergence-302143091.html



The Future of AI in Retail Banking Operations



LONDON, May 13, 2024 /PRNewswire/ — Artificial intelligence (AI) has emerged as a key focus for retail banking operations in recent months. Auriemma Group’s latest series of roundtables tackled the ever-growing space, discussing current usage of AI solutions and how members plan to utilize AI to further enhance their offerings.

“There’s a significant amount of interest in AI across all operational areas,” says Nicole Toussaint, Senior Manager of Industry Roundtables at Auriemma. “Although we’re seeing some hesitation in deploying it, I expect we’ll see some big moves by industry players over the coming months.”
Across all operational areas, firms are hoping to leverage AI to assist front-line employees with navigating their knowledge management systems. This would hopefully improve the accuracy of agent work and help to reduce lengthy training periods. Additionally, firms are considering using the tool enterprise-wide to produce meeting minutes and to assist in communications drafting.
In the Collections and Recoveries space, firms hope to use the tool in their contact strategies. Roundtable members believe that AI can make their contact more effective by creating more bespoke strategies. The technology can help firms decide the optimal time to contact customers, the most effective channel to make contact, and the most engaging messaging and content to use.
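As a purely hypothetical illustration of the kind of scoring such a contact strategy might use (the channels, time buckets, and engagement rates below are invented for this sketch, not drawn from any roundtable member's model), the decision reduces to picking the channel-and-time combination with the highest predicted engagement:

```python
# Toy contact-strategy scorer: pick the (channel, time) pair with the
# highest predicted engagement for a customer. All rates here are
# invented for illustration; a production model would be trained on
# real response history.

# Hypothetical per-customer engagement rates by (channel, time-of-day bucket).
engagement = {
    ("sms", "morning"): 0.12,
    ("sms", "evening"): 0.31,
    ("email", "morning"): 0.22,
    ("email", "evening"): 0.08,
    ("phone", "morning"): 0.05,
    ("phone", "evening"): 0.18,
}

def best_contact(rates: dict) -> tuple:
    """Return the (channel, time) pair with the highest engagement rate."""
    return max(rates, key=rates.get)

channel, time_of_day = best_contact(engagement)
print(f"Contact via {channel} in the {time_of_day}")
```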
Fraud Departments see an opportunity to use AI to improve their current fraud detection models to identify bad actors and fraudulent payments more quickly. Several firms have already partnered with vendors who provide AI-powered fraud mitigation tools.
On the Servicing side, firms are discussing improvements to their chatbot offerings. Currently, many chatbots are FAQ-based, but firms believe that AI can revolutionise the chatbot experience and improve satisfaction scores in the channel.
“We see a clear opportunity to leverage AI to provide dynamic call scripting to front-line agents,” says Toussaint. “This would take some of the pressure off agents when servicing and allow banks to provide a more tailored, well-informed customer experience.”
In the Disputes and Chargebacks space, one firm is already using an AI-integrated optical character recognition (OCR) tool to read customer documents provided as proof in a disputes case. Many firms hope to use AI to gain efficiencies in the disputes process. They believe that the tool can help guide agent decisions as they work cases by pulling in bank, scheme, and regulatory policies.
Similarly, in the Complaints space, members see an opportunity for AI to help complaints agents while investigating. AI can not only analyse the materials provided by the complainant but also bring in insights from previously decisioned cases and Financial Ombudsman (FOS) decisions. In this process, the technology can also guide the agent in classifying the complaint type. Some also noted that AI could potentially assist agents with drafting final response letters.
AI is expected to become an evergreen topic at Auriemma Roundtable meetings, especially as firms identify new use cases and as Auriemma brings in experts from the field to drive further thought leadership.
The next set of roundtable meetings is scheduled for June and July at the Edwardian Hotel in Manchester. If you or any of your colleagues are interested in attending as our guests, please contact us via [email protected].
About Auriemma Group
For 40 years, Auriemma’s mission has been to empower clients with authoritative data and actionable insights. Our team comprises recognised experts in four primary areas: operational effectiveness, consumer research, co-brand partnerships and corporate finance. Our business intelligence and advisory services give clients access to the data, expertise and tools they need to navigate an increasingly complex environment and maximise their performance. Auriemma serves the consumer financial services ecosystem from our offices in London and New York City. For more information, visit us at www.auriemma.group.

View original content: https://www.prnewswire.co.uk/news-releases/the-future-of-ai-in-retail-banking-operations-302141978.html



MSI Highlights Optimized AI Platforms to Accelerate Compute-Intensive Applications at ISC 2024



HAMBURG, Germany, May 13, 2024 /PRNewswire/ — MSI, a leading global server provider, is showcasing its latest server platforms, powered by AMD processors and 5th Gen Intel® Xeon® Scalable processors and optimized for the HPC and AI markets, at ISC 2024, booth #F39 in Hamburg, Germany, from May 13-15.

“As businesses increasingly adopt AI applications to improve customer experiences, the need for greater computing power and denser deployments has spurred significant shifts in IT infrastructure, driving a widespread adoption of liquid cooling,” said Danny Hsu, General Manager of Enterprise Platform Solutions. “MSI’s AI server platforms empower businesses to achieve efficiency gains while handling more compute-intensive workloads.”
Diverse GPU Platforms to Enhance Performance for AI Workloads
MSI G4201 is a 4U supercomputer designed for exceptional performance in compute-intensive tasks. It features up to eight double-wide PCIe 5.0 x16 slots optimized for high-performance GPU cards, alongside one single-wide PCIe 5.0 x16 expansion slot. Each GPU has a full PCIe 4.0 or 5.0 x16 link directly to the root port complex of a CPU socket without going through a PCIe switch, granting maximum CPU-to-GPU bandwidth. Powered by dual 5th Gen Intel Xeon Scalable Processors and equipped with 32 DDR5 DIMMs, the G4201 platform delivers outstanding heterogeneous computing capabilities for various GPU-based scientific high-performance computing, Generative AI, and inference applications. Additionally, the system includes twelve 3.5-inch drive bays for enhanced functionality.
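The direct root-port topology described above matters because of the raw bandwidth at stake. A back-of-the-envelope calculation of per-direction x16 bandwidth, using the standard PCIe per-lane transfer rates and 128b/130b encoding:

```python
# Approximate per-direction bandwidth of a PCIe x16 link.
# PCIe 4.0: 16 GT/s per lane; PCIe 5.0: 32 GT/s per lane.
# Both generations use 128b/130b encoding, so usable bits ≈ raw rate * 128/130.

def pcie_x16_gbps(gt_per_lane: float, lanes: int = 16) -> float:
    """Usable bandwidth in GB/s for one direction of a PCIe link."""
    encoding = 128 / 130
    return gt_per_lane * lanes * encoding / 8  # bits -> bytes

gen4 = pcie_x16_gbps(16)  # ~31.5 GB/s
gen5 = pcie_x16_gbps(32)  # ~63.0 GB/s
print(f"PCIe 4.0 x16: ~{gen4:.1f} GB/s per direction")
print(f"PCIe 5.0 x16: ~{gen5:.1f} GB/s per direction")
```

Routing each GPU straight to a CPU root port means each card gets this full ~63 GB/s per direction on PCIe 5.0, rather than sharing switch uplink bandwidth with other devices.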
The G4101 is a 4U 4GPU server platform designed for AI training workloads. It supports a single AMD EPYC™ 9004 Series processor equipped with a liquid cooling module, along with twelve DDR5 RDIMM slots. Additionally, it features four PCIe 5.0 x16 slots tailored for triple-slot graphic cards with coolers, ensuring increased airflow and sustained performance. With twelve front 2.5-inch U.2 NVMe/SATA drive bays, it offers high-speed and flexible storage options, catering to the diverse needs of AI workloads. The G4101 combines air flow spacing and liquid closed-loop cooling, making it the optimal thermal management solution for even the most demanding tasks.
For small-sized businesses, the liquid-cooled S1102-02 server offers an ideal solution, providing superior thermal performance while optimizing costs. Equipped with a single AMD Ryzen™ 7000 Series processor with liquid cooling support of up to 170W, the system features four DDR5 DIMM slots, one PCIe 4.0 slot, two 10GbE onboard Ethernet ports, and four 3.5-inch SATA hot-swappable drive bays in a compact 1U configuration.
Photo – https://mma.prnewswire.com/media/2407458/MSI_s_Diverse_AI_Server_Platforms_Empower_Businesses_for_Enhanced_Efficiency_and_Performance.jpg

View original content: https://www.prnewswire.co.uk/news-releases/msi-highlights-optimized-ai-platforms-to-accelerate-compute-intensive-applications-at-isc-2024-302139397.html

