Advancing Computer to Human Interaction

MYCOACHBUDDY

Thank you for visiting the MYCOACHBUDDY website. This site contains brief information on company CEO Richard (Rick) Baker and the technology he invented, going back to the early days of the Internet. If you are interested in AI personalization, or in developing new products and services for the now-emerging, more "personalized" digital information age, hopefully you will find this site's technology interesting.

To comprehend the advance Rick Baker's 1991 invention made for fifth-generation computing, robotics and interactive computing, one needs to think about how we as humans best interact with others, for example an "expert" person, and how they normally respond. We have been doing so for thousands of years!

We normally provide an expert with information about what we wish to achieve or learn, and they normally respond by showing us things, which is "visual" advice, and by giving us spoken commentary, which is "audio" advice. But up until Rick's 1991 invention nobody had perceived that a multimedia system could allow a computer to
(a) capture information about what a person wished to achieve or learn via motion-tracking camera sensors, and
(b) with the help of associated software and a database, automatically generate in near real-time, via high-speed communications, a truly personalized "audio and visual" advice presentation for that person. This in turn allows a computer in any format, whether a fifth-generation machine, a humanoid robot or a computer delivering interactive online services, to respond to a person's requirements "audio-visually", the way humans learn best.

You can see technical details of the workings of Rick's invention in issued US Patent No. 5,486,001. He has recently been issued an advance on this earlier system, US Patent No. 11,210,963, which allows a computer to capture motion much more easily than the first invention via what are called IMUs: tiny gyroscopes, accelerometers and magnetometers that can be woven into the likes of smart garments and clothing. The use of these IMUs allows the computer to produce a superior "personalized instructional video" for a person, an improvement over the first invention.

The computer is able to "automatically" generate in "near real-time":
(a) better visuals showing a person how they are performing against a preferred way, plus
(b) more accurate streamed audio instructional comments, presented to the person on any screen.
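The idea of comparing a wearer's motion against a "preferred way" and turning the difference into instructional comments can be sketched in simple code. This is a minimal illustration only: the class names, the root-mean-square comparison and the thresholds below are assumptions for demonstration, not details taken from either patent.

```python
# Illustrative sketch: compare garment-IMU motion samples against a
# reference ("preferred") motion and pick a simple instructional comment.
# All names and thresholds are hypothetical, not from the patents.

from dataclasses import dataclass
from math import sqrt


@dataclass
class ImuSample:
    """One accelerometer reading from a garment-embedded IMU (in g)."""
    ax: float
    ay: float
    az: float


def deviation(actual: list[ImuSample], preferred: list[ImuSample]) -> float:
    """Root-mean-square deviation between the wearer's motion and the
    preferred motion, compared sample by sample."""
    total = 0.0
    for a, p in zip(actual, preferred):
        total += (a.ax - p.ax) ** 2 + (a.ay - p.ay) ** 2 + (a.az - p.az) ** 2
    return sqrt(total / max(len(actual), 1))


def feedback(actual: list[ImuSample], preferred: list[ImuSample]) -> str:
    """Choose an audio-style instructional comment from the deviation."""
    d = deviation(actual, preferred)
    if d < 0.1:
        return "Good form - keep going."
    if d < 0.5:
        return "Close - slow the movement down slightly."
    return "Large deviation - watch the demonstration again."


# Example: a near-perfect repetition versus a badly off-track one.
preferred = [ImuSample(0.0, 0.0, 1.0)] * 3
print(feedback([ImuSample(0.0, 0.0, 1.05)] * 3, preferred))
print(feedback([ImuSample(2.0, 0.0, 1.0)] * 3, preferred))
```

A real system would of course fuse gyroscope and magnetometer data, align the two motions in time, and drive both the on-screen visual overlay and the spoken commentary from the same comparison; this sketch only shows the comparison step in miniature.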

Now you have what was initially thought of as just a hardware box able to communicate with us via our most effective way of learning, audio-visual, with the presentation truly personalized to an end user's requirements.

This system can be used for potentially hundreds of applications, streamed via high-speed communications to a person's screen for direct viewing.

Currently, those involved with interactive programming such as home exercising are not programming the computer, for example a "virtual trainer", correctly to be fully interactive with a person. When they do produce a presentation that comes close to the above system, and to which the public responds extremely well, companies do not realise the potential: they make no further improvements to their programs, leave people with expensive hardware purchases, and then move on to other, less important developments. The public, AI, motion sensing and interactive communications all suffer.

Note: A person or computer can offer the above near real-time personalized audio-visual instructional presentation without the need to physically see the person or object that may be the source of the enquiry.

Should you have an interest in the system, a form is available.

Thank you for looking!