A young Queensland researcher has been recognised for developing ‘superhuman vision’ for meat processing workers, a technology that takes the guesswork out of where the ‘fat ends and the lean beef starts’ during the boning phase.
Engineer Fraser Border was the major award winner in the 2021 Science and Innovation Awards for Young People in Agriculture, Fisheries and Forestry, announced on Friday. The awards are made annually to young scientists, researchers and innovators aged 18 to 35 who have a project that will benefit Australia’s primary industries.
Agriculture Minister David Littleproud announced Mr Border as the winner of the Minister’s Award for his trailblazing augmented reality visualisation technology.
“I am delighted to announce Fraser as the winner of the Minister’s Award, and much deserved too,” Mr Littleproud said.
“This is cutting-edge smarts that could stop meat workers from flying blind when trimming strip loin. Fraser’s superhuman goggles could potentially save the meat processing industry millions of dollars a year, with reduced errors and higher yields.”
The technology will not only enhance operator performance but also serve as a pilot for the adoption of wearables, augmented reality and other emerging technologies in meat processing.
“This is innovation, insight and ingenuity at its very best and will take our already sophisticated meat industry to even higher levels,” Mr Littleproud said.
Mr Border, a University of Southern Queensland researcher, receives $22,000 in grant funding from industry for winning the Australian Meat Processor Corporation category – and a further $22,000 from government for winning the Minister’s Award for his extended research project.
Providing ‘superhuman vision’
Engineer and researcher Fraser Border aims to give meat processing workers superhuman vision with a tool that shows them where to trim beef strip loin.
Meat processing workers trimming strip loins face a near-impossible task. They must trim the meat so that a specified width of fat remains on the beef, without being able to see where the fat meets the lean tissue. Making the job even harder, different countries have different trim specifications.
“While the people on the line are really skilled in their own right, they’re having to guess,” Mr Border said. “They can’t see within the meat to know where to trim to.”
Errors in the process are estimated to cost processors more than $89 million a year. Mr Border’s research hopes to stop slicers from ‘flying blind’ with a tool to visualise where the fat ends and the lean beef starts. His project will employ advanced ultrasound sensors and visualisation techniques to give workers the capability to see sub-surface features.
“Those ultrasound waves will propagate through the fat and when the transducer gets the echo, it can calculate how deep that fat was,” he said.
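As a rough illustration of the time-of-flight calculation Mr Border describes, the sketch below converts an echo’s round-trip time into an estimated fat depth. It is not drawn from the project itself; the speed of sound in fat (around 1,450 m/s) and the example values are assumptions made for illustration only.

```python
# Illustrative sketch only: convert an ultrasound echo's round-trip time
# into an estimated fat depth. The speed of sound in fat (~1450 m/s) and
# the example timing are assumed values, not parameters from the project.

SPEED_OF_SOUND_IN_FAT_M_PER_S = 1450.0  # assumed nominal value

def fat_depth_mm(round_trip_time_s: float) -> float:
    """Estimate fat thickness from the echo's round-trip travel time.

    The pulse travels down to the fat-lean interface and back, so the
    one-way depth is half the round-trip distance.
    """
    round_trip_distance_m = SPEED_OF_SOUND_IN_FAT_M_PER_S * round_trip_time_s
    return (round_trip_distance_m / 2.0) * 1000.0  # metres to millimetres

# Example: a 14-microsecond echo corresponds to roughly 10 mm of fat.
print(f"{fat_depth_mm(14e-6):.1f} mm")
```

In practice the transducer and its signal-processing chain would supply the echo timing; the sketch only shows the underlying arithmetic.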
Meat slicers will be able to see this fat-lean interface on a simple display, helping them guide their blades and achieve higher yields.
The idea for the project stemmed from Mr Border’s experience working in industry, consulting and the university sector. He had originally been looking to automate beef trimming but realised the industry’s biggest challenge was using its diverse workforce more efficiently.
“The more you get into robotics, the more you realise just how great humans are at doing things,” he said.
Mr Border plans to have a working prototype by the end of the project. If it proves successful, he hopes to make the tool more immersive by integrating augmented reality to achieve better results on the line.