Editor's Note: This article originally appeared in the February 2026 print edition of Greenhouse Management under the headline “Inner visions.”

Through their senses alone, growers can glean only limited information about the needs of their crops. So, they augment those senses with sensors that collect data on metrics like soil moisture, CO2 levels and EC.
Going deeper has historically required the help of outside labs or specialized equipment. But recent imaging innovations from a Pennsylvania-based tech start-up called Leaficient may give growers the ability to track metabolic processes of their crops in real time to improve yield — all with little more than standard iPhone-quality cameras, a massive dataset and deep machine learning.
It’s been a long journey, one that involves NASA, driverless cars, AI, wearable health monitors and the sneaking suspicion that plants have vital signs that can be tracked, too.
Peering into the leaf with hyperspectral imaging
In 1987, hyperspectral imaging (HSI) technology emerged that could peer into the chemical and material qualities of vegetation in real time. The technology was first put to use above the Earth in high-altitude aircraft carrying the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), developed by NASA scientist and “father of hyperspectral imaging” Alexander Goetz. AVIRIS measured the light reflected from the ground in hundreds of narrow, contiguous spectral bands, from the visible range into the shortwave infrared, and it was responsible for the discovery of so-called “blue shifts” that signaled plant stress.
Since then, HSI research has proved valuable for large-scale precision agriculture in field-grown food crops. For instance, satellite HSI systems can derive vegetation indices from crop imagery that reveal pest pressure in near real time, allowing farmers to react more quickly. Drones fitted with HSI systems can fly over vineyards to give accurate readings of soil mineral content and vine growth.
However, it’s taken longer for the technology to be applied to crops grown under cover. Reasons for that slow progress include the cost of HSI equipment, the knowledge required to understand the data and the dynamic environments that greenhouses present.
The solution to the problem of affordability and application is a shift in hardware, suggests Leaficient CEO and founder Brian Stancil — as long as that hardware is paired with deep learning AI and robust datasets.

From the battlefield to the greenhouse
Stancil’s unexpected path to the greenhouse began with work he did for the U.S. Department of Defense, where he helped create robots to extract wounded soldiers from the field and dispose of bombs.
In 2016, Stancil started a company called Lifeware Labs, which developed what he calls “a smart Band-Aid.” The wearable health device could be stuck to wounded soldiers in the field and provide information on their vital signs until they could be transported to a higher-level medical unit.
As his company started to seek FDA approval for the device, the need for software and hardware development slowly transitioned to a need for regulatory expertise. Stancil decided it was time to exit the company and start something new. But what?
“Our director of research is a farmer. His family’s been in farming for generations,” Stancil says. “And he said to me, ‘Well, we’ve been measuring vital signs for people for the last six years. I bet you plants have vital signs as well.’”
That supposition launched the biotech company Leaficient.
Stancil noticed there was a large knowledge gap for those producing in greenhouses. While they could track and control a wealth of environmental information, they had a limited view of what was happening inside the plant. HSI research tech and its insights were largely inaccessible to growers.
“It’s never made it out of the lab because of the prohibitive expense,” Stancil explains.
To lower the cost, he considered driverless car innovations. Autonomous vehicles are dependent on imaging to navigate. Some use a form of lidar to map their surroundings. Others use depth-sensing stereoscopic cameras. Still others use cameras enhanced by software. In fact, the industry has been working on ways to use software solutions to upgrade standard camera imaging to hyperspectral imaging.
Stancil realized he could do the same, using the relatively inexpensive cameras found in high-end consumer electronics and boosting the data resolution through AI to capture nuances the human eye can’t see.
“Our cameras are using a Sony camera chip set that you probably have in your phone. So, that’s what we constrained ourselves to,” Stancil says.
Leaficient’s software looks at the visible light spectrum available to its cameras and uses machine learning, trained on massive plant datasets, to break each image down into very narrow spectral bands and see far more than the raw sensor data shows. The approach is based loosely on the spectral reconstruction algorithms used to upscale low-resolution digital images.
“The sensitivity of this equipment is so much higher than the human eye,” Stancil says. “We can actually detect down to 10-nanometer chunks, within that sort of 400- to 950-nanometer range. And so that’s where you can start to understand the chemical properties and material properties of the plant.”
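Leaficient hasn’t published its model, but the general shape of spectral reconstruction is easy to sketch: learn a mapping from a camera’s three color channels to many narrow bands using paired training data. Below is a minimal Python illustration with synthetic stand-in data and a plain least-squares fit standing in for the deep network a real system would use; every name and number here is hypothetical.

```python
import numpy as np

# Illustrative sketch only: Leaficient's actual model is proprietary.
# The idea: map a camera's 3 color channels onto narrow 10 nm bands
# via a transform learned from paired RGB/hyperspectral data.

BANDS = np.arange(400, 951, 10)   # 400-950 nm in 10 nm steps (56 bands)
rng = np.random.default_rng(0)

# Stand-in training data: paired RGB and hyperspectral readings of the
# same leaf pixels (a real system trains on millions of plant images).
rgb_train = rng.random((1000, 3))
spectra_train = rng.random((1000, len(BANDS)))

# Fit a simple linear 3-to-56 mapping by least squares.
W, *_ = np.linalg.lstsq(rgb_train, spectra_train, rcond=None)

def reconstruct(rgb_pixel):
    """Estimate a 56-band reflectance spectrum from one RGB pixel."""
    return rgb_pixel @ W

spectrum = reconstruct(rgb_train[0])
print(f"band at {BANDS[13]} nm -> estimated reflectance {spectrum[13]:.3f}")
```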

The plant naps beneath glass
To develop Leaficient, Stancil needed to work with growers whose crops grew quickly and uniformly, in a process more closely related to manufacturing than horticulture. Leafy greens producers were the ideal candidates.
“CEA has sort of the power and data infrastructure that’s necessary for persistent monitoring, which is what our cameras do right now,” says Stancil. “We wanted to look at photosynthesis because it kicks off a lot of chemical processes. It ties directly to yield in a lot of different plants. So, that’s where we started.”
The Leaficient team partnered with controlled environment agriculture greenhouses to place as many cameras as possible above leafy green production. These cameras take an image every 10 minutes, making the current dataset somewhere around 10 million individual plant images.
Leaficient software scans the plant images for changes in the xanthophyll cycle, a predictable process that responds to high or low light intensity levels to help a plant photosynthesize. Xanthophylls reflect light in a very specific band, one the software can isolate as it breaks the image down into bands spanning the visible and near-infrared range.
“We can look and see which bands are reflected and make estimates on the amount of xanthophyll concentration,” Stancil says. “We’re looking at how that relates to photosynthesis and how photosynthesis relates to yield.”
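The article doesn’t say which bands Leaficient targets, but published plant research commonly tracks the xanthophyll cycle through the photochemical reflectance index (PRI), which compares reflectance at 531 nanometers against a 570-nanometer reference band. A hedged sketch of that published index, reusing the 10-nanometer bands from the example above:

```python
import numpy as np

BANDS = np.arange(400, 951, 10)   # same 10 nm grid as above

def pri(spectrum):
    """Photochemical reflectance index from a 10 nm-band spectrum.

    PRI = (R531 - R570) / (R531 + R570). Lower values suggest more
    xanthophyll de-epoxidation, i.e. a light-stressed leaf. The 531/570
    nm pairing comes from published research, not from Leaficient.
    """
    r531 = spectrum[np.argmin(np.abs(BANDS - 531))]
    r570 = spectrum[np.argmin(np.abs(BANDS - 570))]
    return (r531 - r570) / (r531 + r570)
```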
It’s essentially a way to see plant light saturation levels, a measurement that has largely been confined to labs. To be fair, researchers have published optimal light intensity ranges, in μmol·m⁻²·s⁻¹, that ensure efficient photosynthesis for many commercial plants like leafy greens. But both plants and greenhouse lighting are dynamic, and in the field, Leaficient’s system was documenting an interesting phenomenon.
“We find that a lot of leafy greens start out photosynthesizing efficiently in the morning, and then around mid-afternoon, they take naps,” Stancil says. “We call them plant naps because the efficiency sort of tanks off in the early afternoon, and then depending on the greenhouse, it can sort of pop back up in the evening.”
Plotting the photosynthetic efficiency captured by the Leaficient system on a graph created a distinct curve. The “eureka” moment came when the team compared that curve to curves researchers had observed in lab settings with much more sophisticated equipment and saw the same phenomenon.
“The shape and response to those curves were actually there, and we could start to reproduce it,” Stancil says. “It’s like, OK, alright, there’s a path forward to making this academic sort of research available to farmers at scale.”
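How a “plant nap” might be flagged in software is straightforward to imagine, though the logic and thresholds below are purely hypothetical and not Leaficient’s:

```python
import numpy as np

def detect_nap(hours, efficiency, afternoon=(13, 16), drop=0.15):
    """Flag a midday sag: mean afternoon efficiency falling `drop`
    (15%) below the morning peak. Thresholds are hypothetical."""
    hours = np.asarray(hours)
    efficiency = np.asarray(efficiency)
    morning_peak = efficiency[hours < 12].max()
    midday = efficiency[(hours >= afternoon[0]) & (hours < afternoon[1])].mean()
    return (morning_peak - midday) / morning_peak > drop
```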

The views from above and on the ground
Being able to track photosynthetic efficiency is important for growers. If you are providing light the plant isn’t using, you’re burning resources. If the plant could photosynthesize more but you’re not providing enough light, you’re leaving yield on the table.
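What “light the plant isn’t using” means can be made concrete with a textbook light-response curve from plant physiology (a standard model, not Leaficient’s method): past the saturation point, extra PPFD buys almost no extra photosynthesis.

```python
import numpy as np

def assimilation(ppfd, p_max=20.0, alpha=0.05):
    """Exponential light-response model: net photosynthesis (umol CO2
    m-2 s-1) vs. PPFD (umol photons m-2 s-1). p_max and alpha are
    illustrative values, not measurements."""
    return p_max * (1 - np.exp(-alpha * ppfd / p_max))

for ppfd in (100, 300, 600, 1200):
    print(f"PPFD {ppfd:>4} -> assimilation {assimilation(ppfd):5.2f}")
```

In this toy curve, doubling PPFD from 600 to 1,200 adds only about 20% more photosynthesis; light supplied beyond saturation is mostly wasted electricity.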
Ohio CEA leafy greens producer Great Lakes Growers has been using Leaficient technology for about eight months. Director of Growing Renato Zardo has had firsthand experience tracking the photosynthetic efficiency of his crop.
“They help a lot,” Zardo says. He notes he’s finally able to find answers to “the main questions about the saturation level of light for my crop at a particular stage.”
The data available to Zardo on the interface is relatively rich. Not only can he see a measure of current photosynthesis, but he can also see the total photosynthesis accumulated since the beginning of the day, alongside metrics like temperature and vapor-pressure deficit.
Zardo says the data gives him valuable insight into whether crops are assimilating light and helps him make strategic decisions like when to close the shades. That’s something his previous monitoring and control system, which could only tell him his DLI, or daily light integral, was missing.
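For context, DLI compresses a day’s light into a single number: PPFD integrated over the photoperiod. A 10-minute reading cadence like the one Leaficient’s cameras use integrates to DLI like this (the numbers are illustrative):

```python
def dli(ppfd_readings, interval_s=600):
    """DLI in mol m-2 per day from PPFD readings (umol m-2 s-1)
    taken every interval_s seconds (600 s = 10 minutes)."""
    return sum(ppfd_readings) * interval_s / 1e6

# A constant 400 umol m-2 s-1 across a 16-hour photoperiod:
print(dli([400] * 96))  # 96 ten-minute readings -> 23.04 mol m-2
```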
“The camera allows me to see if everything that I’m doing for the crop is translating into growth or not,” he says. “The report that I get from Leaficient is how much correlation I have between photosynthesis and yield. So far, I think we got like 70% of yield prediction according to the photosynthesis, which is good. Now, we’re going to work on details.”


The future coming to light
Both Zardo and Stancil are looking forward to the ability to integrate the Leaficient camera data into greenhouse controllers. Currently, the company is working with several controller manufacturers to make that happen.
Looping Leaficient’s insights into greenhouse controls presents an intriguing glimpse of the future. It may be possible someday soon for a greenhouse crop to determine optimal light on its own, creating a fully closed loop.
That would mean as plants need more light to photosynthesize more efficiently, the light intensity would change automatically in response. And when plants took their afternoon naps, the lights would dim. Stancil even imagines a scenario where the image data from the plants could shift the LED light spectrum automatically.
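A speculative sketch of what that feedback loop could look like, with every name and threshold hypothetical rather than drawn from Leaficient’s roadmap:

```python
def adjust_setpoint(efficiency, ppfd_setpoint, step=25,
                    low=0.60, high=0.85, ppfd_max=800):
    """Nudge the supplemental-light setpoint from a camera-derived
    efficiency score (0-1). All thresholds are hypothetical."""
    if efficiency < low:       # plant is "napping": extra light is wasted
        return max(0, ppfd_setpoint - step)
    if efficiency > high:      # light is being used well: offer more
        return min(ppfd_max, ppfd_setpoint + step)
    return ppfd_setpoint       # in the efficient band: hold steady
```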
But it’s possible one of the most valuable resources Leaficient provides is data.
“Growers should understand that this data is an investment, and it’s going to start paying off in near-term ways and long-term ways,” Stancil says. Who knows what other solutions could be built from the data growers collect, given the right algorithm to unlock it? “I’m very excited for the agriculture industry to start taking advantage of this.”