Google Cloud Transforming Healthcare With AI
At the recent HIMSS24 event in Orlando, Google Cloud unveiled a set of new solutions designed to enhance interoperability, establish a robust data foundation, and deploy generative AI tools within the healthcare and life sciences sectors, all of which promise to improve patient outcomes.
Quick Take: SPDX 3.0 Release
The SPDX community, in collaboration with the Linux Foundation, recently released SPDX 3.0, a significant milestone for the Software Bill of Materials (SBOM) communication format.
SPDX 3.0 delivers a comprehensive set of updates, including an overhauled model, specification, and license list, as well as new SPDX profiles designed to handle modern system use cases. The release makes the SBOM format more versatile and adaptable.
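To give a feel for the element-graph shape SPDX 3.0 introduces, the snippet below sketches a minimal SPDX 3.0-style SBOM in Python and serializes it as JSON-LD. The type and field names here (software_Package, spdxId, creationInfo, the 3.0 context URL) are approximations of the published serialization, not a verified schema, so check them against the SPDX 3.0 specification before relying on them.

```python
# Illustrative sketch only: an approximation of an SPDX 3.0-style element graph
# serialized as JSON-LD. Field and type names are simplified assumptions;
# consult the SPDX 3.0 specification for the authoritative schema.
import json

creation_info = {
    "type": "CreationInfo",
    "@id": "_:creationinfo",
    "specVersion": "3.0.0",
    "created": "2024-04-30T00:00:00Z",
    "createdBy": ["https://example.org/agents/build-bot"],
}

package = {
    "type": "software_Package",  # Software-profile element (assumed name)
    "spdxId": "https://example.org/sbom/pkg/libexample",
    "name": "libexample",
    "software_packageVersion": "1.2.3",
    "creationInfo": "_:creationinfo",
}

document = {
    "@context": "https://spdx.org/rdf/3.0.0/spdx-context.jsonld",
    "@graph": [creation_info, package],
}

print(json.dumps(document, indent=2))
```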
What Microsoft’s FQ3 2024 Earnings Reveal About Azure
Microsoft announced its fiscal Q3 2024 results, demonstrating robust growth. The company reported total revenue of $61.9 billion, a 17% increase year over year, driven primarily by its cloud segment.
Microsoft’s Cloud business, which includes Azure and other cloud properties such as LinkedIn, generated over $35 billion in revenue, up 23%. During its earnings call, Microsoft said that Azure’s expansion and the increasing demand for AI infrastructure propelled much of its growth.
What Alphabet Earnings Tell Us About Google Cloud
Google Cloud posted significant growth during the quarter, driven by rising demand for AI and cloud-based solutions. The business appears well positioned for continued expansion, with a focus on infrastructure leadership, AI-driven innovation, and strong customer adoption.
Quick Take: Western Digital’s FQ3 2024 Earnings
Western Digital reported strong results for its third quarter of fiscal 2024, with revenue of $3.5 billion, a non-GAAP gross margin of 29.3%, and non-GAAP earnings per share of $0.63, exceeding consensus estimates.
Quick Take: Apple’s OpenELM Small Language Model
Apple this week unveiled OpenELM, a set of small AI language models designed to run on local devices such as smartphones rather than relying on cloud-based data centers. The release reflects a growing trend toward smaller, more efficient AI models that can operate on consumer devices without significant computational resources.
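As a rough illustration of what running one of these models locally might look like, the sketch below loads a small OpenELM checkpoint with Hugging Face Transformers. The model id (apple/OpenELM-270M-Instruct), the trust_remote_code flag, and the use of the Llama 2 tokenizer are assumptions drawn from Apple's published model cards rather than an official quickstart, so verify current usage before adopting it.

```python
# Illustrative sketch: running a small OpenELM checkpoint locally with
# Hugging Face Transformers. Model id and tokenizer choice are assumptions
# based on Apple's model cards; check the cards for current usage.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M-Instruct"  # smallest instruct variant (assumed id)

# Apple's model card pairs OpenELM with the Llama 2 tokenizer (a gated repo
# that requires accepting Meta's license on Hugging Face).
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "Small on-device language models are useful because"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```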