For decades, the tech industry has followed a simple rule: smaller transistors = faster computers. Transistors—tiny electronic switches that process data—are the building blocks of every chip, and the more you can pack onto a processor, the more powerful it becomes. This formula has given us the modern world, from smartphones to AI.
But traditional silicon chips are running out of headroom. Engineers now cram billions of microscopic transistors onto each processor, yet every new advance demands more energy and generates more heat. The result? Data centers—the backbone of AI, cloud computing, and everything online—are turning into energy-hungry giants. If current trends hold, some experts predict that computing could consume 80% of the world’s electricity by 2040.
At the same time, AI is demanding even more power. Training a single advanced AI model today requires as much energy as hundreds of homes use in a year. The world needs a new approach—one that delivers more speed without the energy drain.
That’s exactly what one startup is betting on. Instead of relying entirely on electricity, it’s turning to light. By using photons instead of electrons, this company has developed a radically new type of AI chip—one that can process information faster, with far less power. And investors are paying attention: it’s already raised $850 million and is now valued at $4.4 billion.
That company? Lightmatter.
Lightmatter was founded in 2017 by Nicholas Harris, Darius Bunandar, and Thomas Graham, who spun the company out of the Massachusetts Institute of Technology (MIT) after developing early breakthroughs in photonic computing. Their goal was to rethink AI hardware from the ground up, replacing traditional electronic interconnects with light-based data transfer to overcome the growing energy and scalability challenges in computing.
Since then, Lightmatter has built deep partnerships with semiconductor leaders—including AMD, NVIDIA, Intel, and Qualcomm—to integrate its photonic interconnect substrate into next-generation AI hardware. Rather than competing directly with chipmakers, Lightmatter’s technology enhances their processors by enabling faster, more efficient data transfer. According to CEO Nicholas Harris, Lightmatter’s substrate is “the foundation for how computing will make progress. It’ll reduce the energy consumption of these clusters dramatically and enable scaling to a million nodes and beyond [which] is how you get to wafer-scale, the biggest chips in the world, and AGI.”
On real-world AI workloads, Lightmatter’s system is up to 10x faster than NVIDIA’s A100 chip while using just one-fifth of the power. It also delivers eight times the throughput of an NVIDIA server blade, underscoring its efficiency advantages. These performance gains could make photonic interconnects a crucial part of future AI infrastructure, where power constraints and scalability challenges are becoming major bottlenecks.
In November 2024, Lightmatter announced a strategic partnership with Amkor Technology, one of the largest semiconductor packaging and testing providers. Together, they are building the world’s largest 3D-packaged chip complex using Lightmatter’s Passage interconnect technology. This collaboration aims to push the limits of multi-die chip packaging, a technique that divides large semiconductor chips into smaller dies within a single package. By integrating Lightmatter’s photonic technology, the system can dramatically reduce the energy required for data transfer between components, paving the way for higher-performance AI and high-performance computing (HPC) architectures.
As of February 2025, Lightmatter employed an estimated 191 people across its Boston and Mountain View offices, a 46% increase from 2023. This headcount expansion reflects the company’s accelerating commercialization efforts as it prepares for broader deployment of its technology in AI data centers.
Lightmatter operates at the intersection of AI hardware and photonic computing, developing next-generation processors and interconnect systems designed to improve the speed and energy efficiency of AI workloads. Unlike traditional semiconductor companies that rely solely on electronic circuits, Lightmatter’s technology integrates both light (photons) and electricity (electrons) to process and transfer data more efficiently.
Rather than manufacturing general-purpose chips, Lightmatter focuses on high-performance AI infrastructure, offering a full-stack solution that includes both photonic processors and optical interconnects. This positions the company as a critical supplier to data centers looking to scale AI workloads without hitting power and efficiency bottlenecks.
Lightmatter generates revenue through high-value capital equipment sales to cloud providers, AI companies, and large data center operators. The company’s ecosystem includes:
Envise – Photonic AI Processor
Designed to accelerate AI workloads by using light for computations.
Reduces power consumption while improving performance compared to traditional GPUs and AI accelerators.
Competes with NVIDIA’s AI chips, offering an energy-efficient alternative.
Passage – Photonic Interconnect System
Uses light to transfer data between processors, enabling ultra-fast communication between chips.
Allows AI clusters to scale to hundreds of thousands of processors without energy bottlenecks.
Competes with traditional electrical interconnects from semiconductor giants.
By selling both processors and interconnects, Lightmatter benefits from multiple revenue streams within a single deployment, creating a sticky ecosystem where customers adopting its technology gain efficiency advantages that are difficult to replicate with traditional solutions.
Lightmatter follows an enterprise-first strategy, focusing on securing large-scale deals with cloud computing providers and AI infrastructure firms rather than targeting individual consumers or small businesses.
Key advantages of this approach:
Strategic Partnerships – Works closely with hyperscalers and AI hardware firms to integrate its technology into existing data center infrastructure.
Silicon-Compatible Manufacturing – Lightmatter’s technology can be fabricated using existing semiconductor processes, reducing production complexity.
High Switching Costs – Customers investing in Lightmatter’s ecosystem benefit from deep integration and long-term efficiency gains, making it harder to switch to competitors.
By positioning itself as a next-generation AI infrastructure provider, Lightmatter is targeting the multi-billion-dollar AI computing and data center market, competing with industry giants like NVIDIA and Intel, while pioneering an entirely new category of photonic computing.
Despite its technological advancements and strong partnerships with semiconductor firms like AMD, NVIDIA, Intel, and Qualcomm, Lightmatter has yet to publicly announce major customers deploying its technology at scale. While it has formed strategic partnerships with companies like Amkor Technology and GlobalFoundries to bring its products to market, the lack of disclosed commercial customers raises execution risks:
Unproven Market Adoption – Without clear public validation from major cloud providers or AI firms, it’s uncertain how quickly Lightmatter’s solutions will be adopted.
Competitive Pressures – Semiconductor giants like NVIDIA, Intel, and AMD have strong customer relationships and may develop competing solutions before Lightmatter achieves widespread deployment.
Long Sales Cycles – Selling AI infrastructure involves extended testing and integration processes. If potential customers delay adoption, revenue growth could be slower than expected.
For investors, the key question remains: Will Lightmatter convert its technological edge into large-scale commercial deals? The next phase of its growth will depend on turning partnerships into paying customers and proving that photonic computing can deliver real-world efficiency gains at scale.
The demand for computing power is outpacing the capabilities of traditional semiconductor technology. AI models are growing exponentially in complexity, requiring vast amounts of energy and computational resources. At the same time, the semiconductor industry is reaching physical limits—transistors are nearing their smallest practical size, and improvements in power efficiency are slowing. This combination of rising AI demand and stagnating hardware efficiency is creating a bottleneck in AI infrastructure, driving the need for breakthrough technologies that can sustain performance growth without excessive energy consumption.
Lightmatter’s photonic computing offers a way to bypass these constraints, using light-based interconnects to improve speed, efficiency, and scalability. By reducing energy consumption and enabling higher-density AI clusters, Lightmatter could become a foundational player in the next generation of data center infrastructure. If its technology delivers on its promise, it could redefine how AI workloads are processed, providing an alternative to the power-hungry GPUs and networking bottlenecks that currently dominate AI infrastructure.
A key area of opportunity lies in integrating Lightmatter’s photonic technology into large-scale AI infrastructure. Modern AI clusters consist of tens of thousands of processors, pushing traditional interconnects to their limits. Lightmatter’s Passage photonic interconnect system allows these clusters to scale more efficiently by reducing latency and lowering power consumption, both of which are critical for handling large AI workloads. If the technology gains widespread adoption, Lightmatter could capture a meaningful share of the multi-billion-dollar AI networking and processing market. The company is already working with major semiconductor firms to integrate its interconnects, a sign that the industry is looking for alternatives to traditional chip scaling. Beyond accelerating AI inference, Lightmatter’s platform could also expand into AI model training, a process that demands even greater computational power. If the company moves into this segment, its total addressable market could grow significantly.
Beyond AI, the broader data center industry is facing increasing regulatory and economic pressure to reduce energy consumption. Traditional electronic interconnects generate heat and consume significant power, limiting scalability as data centers expand. Cloud providers are actively searching for next-generation computing architectures that can support AI workloads without overwhelming power grids. Lightmatter’s photonic interconnects provide a potential solution by enabling more energy-efficient data centers that can scale without hitting thermal or power constraints. If the technology proves viable at scale, it could become a critical part of next-generation cloud and hyperscale data centers, offering a path forward for sustainable AI infrastructure.
While Lightmatter’s immediate focus is on AI and data centers, photonic computing has the potential to expand into other industries that require high-speed, low-latency processing. Financial institutions engaged in high-frequency trading could benefit from faster and more energy-efficient computing for real-time market analysis and algorithmic trading. The telecommunications industry could also see gains in network switching and data transfer efficiency by incorporating photonic interconnect technology. In high-performance computing (HPC), where supercomputers are used for scientific research and simulations, Lightmatter’s technology could provide the processing speed and energy efficiency needed for complex workloads. Longer term, as manufacturing scales and costs decline, photonic chips could find applications in mainstream computing, from enterprise IT infrastructure to personal devices.
The convergence of AI growth, semiconductor limitations, and rising energy constraints presents a unique moment for Lightmatter. If the industry shifts toward photonic computing, the company is well-positioned as a leading contender to shape this transition. However, the timeline for adoption remains uncertain. AI hardware is a capital-intensive industry with long sales cycles, and widespread adoption will depend on proving that photonic computing can reliably deliver efficiency and performance gains in real-world deployments. The next few years will be critical in determining whether Lightmatter’s technology can transition from a promising innovation to a fundamental part of AI infrastructure.
Lightmatter operates in the emerging field of photonic computing, where it faces competition from both specialized startups and established semiconductor giants. While photonic computing remains a nascent industry, the accelerating demand for AI hardware has attracted increasing investment and technological development.
Several early-stage companies are also developing photonic computing solutions, though none have matched Lightmatter’s funding scale or commercialization progress.
Luminous Computing – A venture-backed startup working on photonic AI accelerator chips. It raised $105 million in Series A funding, but its technology remains at an earlier stage, lacking the full-stack approach Lightmatter has developed.
Celestial AI – Raised $175 million in a Series C round and focuses exclusively on optical interconnect technology. Unlike Lightmatter, Celestial AI does not produce photonic AI processors, making its market scope more limited.
Beyond full photonic computing solutions, Lightmatter also competes with companies specializing in optical interconnects, which enable faster data transfer between processors.
Ayar Labs – The leading player in optical input/output (I/O), having raised $155 million in funding and reached a $1 billion valuation. Optical I/O replaces traditional electrical connections with light-based communication, reducing energy consumption and heat buildup in high-performance computing systems. Unlike Lightmatter, which offers both AI processing and interconnect technology, Ayar Labs focuses solely on optical I/O and does not build AI processors, making it a narrower solution.
The largest competitive threat to Lightmatter comes from the traditional semiconductor giants that dominate AI hardware, including NVIDIA, Intel, and AMD. These companies control the graphics processing unit (GPU) and AI accelerator markets, giving them extensive customer relationships, supply chains, and research and development budgets.
NVIDIA – The undisputed leader in AI chips, with its H100 GPUs serving as the backbone of most AI training and inference workloads. GPUs are specialized processors designed to handle large-scale parallel computing, making them ideal for AI applications. While NVIDIA has not yet launched a photonic computing product, it has the resources to develop or acquire competing technology if photonic computing proves viable at scale.
Intel – A strategic backer of Ayar Labs, signaling its interest in integrating optical interconnects into its next-generation chips. Intel’s existing relationships with data centers and cloud providers give it a strong advantage in commercializing new semiconductor technologies.
AMD – Like NVIDIA, AMD has a dominant presence in high-performance computing and AI acceleration. While it has not directly invested in photonics, it remains a potential competitor should it decide to pursue this technology.
Despite these challenges, Lightmatter has a first-mover advantage in photonic AI computing and benefits from manufacturing compatibility with existing semiconductor processes. However, its success depends on converting technological promise into commercial traction before larger players enter the space. If Lightmatter secures early adoption from cloud providers, it could establish a stronghold before incumbents develop competing solutions.
Lightmatter’s valuation has surged in recent years, reflecting investor confidence in photonic computing and the increasing urgency to develop more energy-efficient AI hardware. Since its founding, the company has attracted backing from some of the most well-regarded venture and institutional investors, many of whom have continued to reinvest across multiple rounds.
GV (Google Ventures) has been an investor since Lightmatter’s Series A in 2018, alongside Matrix Partners, Spark Capital, and Viking Global Investors. As the company progressed, top-tier investors doubled down, with Viking, GV, and SIP Global Partners participating in the Series B in 2021, a round that also brought in HP Enterprise and Lockheed Martin—an early sign of industry interest in Lightmatter’s technology.
Investor commitment deepened in 2023, when Lightmatter raised a $153 million Series C led by Fidelity, with continued participation from GV, Viking, and HPE Pathfinder. By October 2024, Lightmatter secured a $400 million Series D at a $4.4 billion valuation, with T. Rowe Price Associates and Fidelity leading the round. GV remained a participant, marking its fourth consecutive round—a rare signal of long-term conviction from one of the most respected venture firms in deep tech.
While Lightmatter has not publicly disclosed customers or revenue figures, the sustained investment from high-profile firms provides a strong market signal. Venture firms often exit after early-stage growth, but GV, Viking, and SIP Global Partners have stayed invested across multiple funding rounds, suggesting confidence in Lightmatter’s long-term potential.
Compared to other AI hardware startups, Lightmatter’s fundraising trajectory stands out. Ayar Labs, a key competitor in optical interconnects, has raised only $155 million to date, while Celestial AI secured $175 million, both significantly lower than Lightmatter’s $850 million total. The increasing presence of large institutional investors—rather than purely venture capital—suggests a shift toward late-stage preparation, with expectations for significant commercial adoption in the coming years.
Pros | Cons
---|---
Solves AI’s Growing Energy Crisis – Lightmatter’s technology addresses critical energy bottlenecks in AI computing, allowing data centers to scale efficiently while significantly reducing power consumption. | Unproven Commercial Adoption – The company has not publicly disclosed major customers, making it difficult to assess how quickly enterprises are adopting its technology.
Full-Stack Photonic Computing – Unlike competitors that focus on either AI processors or interconnects, Lightmatter provides both, creating a more integrated and scalable solution for AI infrastructure. | Strong Incumbent Competition – Semiconductor giants like NVIDIA, Intel, and AMD have deep customer relationships and the resources to develop or acquire competing photonic solutions.
Compatible with Existing Semiconductor Manufacturing – Lightmatter’s chips can be fabricated using standard semiconductor processes, reducing production barriers and easing large-scale adoption. | High Capital Requirements – Developing and scaling photonic computing is expensive, requiring ongoing investment to remain competitive and reach mass production.
Continued Investment from Top-Tier Firms – Lightmatter has raised $850 million, with repeat investments from GV, Fidelity, Viking Global Investors, and T. Rowe Price, signaling long-term confidence. | Long Adoption Timelines – Transitioning from electronic computing to photonic solutions requires extensive testing, validation, and infrastructure integration, slowing down adoption.
Potential Beyond AI – While AI and data centers are the primary market, photonic computing could expand into high-frequency trading, telecommunications, and high-performance computing. | Execution Risk – Turning its technological edge into large-scale enterprise deals remains a challenge, and delays in commercialization could impact its valuation and long-term prospects.
Lightmatter shares are currently available on secondary markets at a notable discount compared to the company’s most recent Series D funding round in October 2024, where shares were priced at $80.23. Given how recently that round closed, the available secondary market prices suggest an opportunity to invest below the valuation set by institutional investors like T. Rowe Price and Fidelity.
One reason for these discounted opportunities is that early employees are selling shares, which often leads to attractive pricing in secondary markets. These sellers may be looking to liquidate some of their holdings after years at the company, creating openings for investors who are comfortable navigating private market transactions.
Platform | Structure | Price Per Share | Discount to Series D | Minimum Investment
---|---|---|---|---
Forge | Direct Trade | $63.00 | 21.5% discount | No minimum listed
Hiive | SPV | $63.25 | 21.2% discount | $25K minimum
EquityZen | Direct Trade | $68.00 | 15.2% discount | $150K minimum
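The discounts in the table follow directly from the Series D price of $80.23 per share. A minimal sketch of the arithmetic, with the platform prices taken from the listings above:

```python
# Discount of a secondary-market price relative to Lightmatter's
# Series D price of $80.23 per share (October 2024).
SERIES_D_PRICE = 80.23

def discount_pct(price: float, reference: float = SERIES_D_PRICE) -> float:
    """Percentage discount of `price` relative to `reference`."""
    return (reference - price) / reference * 100

# Platform prices as listed on the secondary markets above.
for platform, price in [("Forge", 63.00), ("Hiive", 63.25), ("EquityZen", 68.00)]:
    print(f"{platform}: {discount_pct(price):.1f}% discount")
```

The same formula can be inverted to sanity-check a listing: a quoted 21.5% discount implies a price of roughly $80.23 × (1 − 0.215) ≈ $63.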
Direct trades provide direct exposure to the cap table, meaning investors own the shares outright. However, they also require significant due diligence, as purchasing private shares involves understanding transfer restrictions, company financials, and potential liquidity timelines. Investors should evaluate whether they are comfortable with holding periods, legal considerations, and valuation risks before proceeding with a direct trade.
For a deeper dive into different private market investment structures—including SPVs, direct trades, and fund-based opportunities—review our guide here: Private Company Investment Types.
For any questions about investing in private markets, reach out to [email protected].
This material has been distributed solely for informational and educational purposes and is not a solicitation or an offer to buy any security or to participate in any trading strategy. All material presented is compiled from sources believed to be reliable, but accuracy, adequacy, or completeness cannot be guaranteed, and Cold Capital makes no representation as to its accuracy, adequacy, or completeness.
The information herein is based on Cold Capital’s beliefs, as well as certain assumptions regarding future events based on information available to Cold Capital on a formal and informal basis as of the date of this publication. The material may include projections or other forward-looking statements regarding future events, targets or expectations. Past performance of a company is no guarantee of future results. There is no guarantee that any opinions, forecasts, projections, risk assumptions, or commentary discussed herein will be realized. Actual experience may not reflect all of these opinions, forecasts, projections, risk assumptions, or commentary.
Cold Capital shall have no responsibility for: (i) determining that any opinions, forecasts, projections, risk assumptions, or commentary discussed herein is suitable for any particular reader; (ii) monitoring whether any opinions, forecasts, projections, risk assumptions, or commentary discussed herein continues to be suitable for any reader; or (iii) tailoring any opinions, forecasts, projections, risk assumptions, or commentary discussed herein to any particular reader’s objectives, guidelines, or restrictions. Receipt of this material does not, by itself, imply that Cold Capital has an advisory agreement, oral or otherwise, with any reader.
Different types of investments involve varying degrees of risk, and there can be no assurance that the future performance of any specific investment, investment strategy, company or product made reference to directly or indirectly in this material, will be profitable, equal any corresponding indicated performance level(s), or be suitable for your portfolio. Due to rapidly changing market conditions and the complexity of investment decisions, supplemental information and other sources may be required to make informed investment decisions based on your individual investment objectives and suitability specifications. All expressions of opinions are subject to change without notice. Investors should seek financial advice regarding the appropriateness of investing in any security of the company discussed in this presentation.