Idea:
Automatically generate a palm reading: use edge detection to find the relevant palm lines, compare them against the descriptions in a palm reading guide, and spit out the results.
To-Do: Everything
- Use Handtrack.js to isolate the hand from an image (or webcam feed); see the first sketch after this list.
- Use edge detection to turn the isolated hand into a black and white image with the palm lines visible (second sketch below).
- Somehow identify the major palm lines (heart, head and life, apparently), and measure the position, length and curvature of each (oh jeeze); a rough measuring sketch is below. Might have to literally train a neural network to accomplish this, but I'll probably abandon the idea if there's no pre-existing database / pre-trained model that I can use.
- Draw a fancy graphic over the palm image to show these lines (coloured lines traced over the major palm lines); overlay sketch below.
- Generate predictions from the measured variables (last sketch below).
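
A minimal sketch of the Handtrack.js step, going off its documented load/detect flow: load a model, run `detect()`, and crop the first bounding box onto a canvas. `isolateHand` is my own hypothetical wrapper, and the exact prediction fields (`bbox` as `[x, y, width, height]`) are worth verifying against the current docs.

```js
// Assumes handtrack.js is loaded via its CDN script tag, e.g.
// <script src="https://cdn.jsdelivr.net/npm/handtrackjs/dist/handtrack.min.js"></script>

const modelParams = {
  flipHorizontal: false, // no mirroring needed for still images
  maxNumBoxes: 1,        // we only care about one hand
  scoreThreshold: 0.6,   // drop low-confidence detections
};

async function isolateHand(img) {
  const model = await handTrack.load(modelParams);
  const predictions = await model.detect(img);
  if (predictions.length === 0) return null;

  // bbox is [x, y, width, height]; crop it onto a fresh canvas
  const [x, y, w, h] = predictions[0].bbox;
  const canvas = document.createElement("canvas");
  canvas.width = w;
  canvas.height = h;
  canvas.getContext("2d").drawImage(img, x, y, w, h, 0, 0, w, h);
  return canvas;
}
```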
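
For the edge detection step, one option (my assumption, not something decided above) is OpenCV.js, which ships Canny edge detection. The thresholds below are guesses and will likely need tuning, since palm creases are much fainter than the hand's outline.

```js
// Assumes opencv.js is loaded, e.g.
// <script src="https://docs.opencv.org/4.x/opencv.js"></script>
// OpenCV.js is my suggestion here; any Canny/Sobel implementation would do.

function palmEdges(handCanvas, outCanvasId) {
  const src = cv.imread(handCanvas); // RGBA Mat from the cropped hand
  const gray = new cv.Mat();
  const edges = new cv.Mat();

  cv.cvtColor(src, gray, cv.COLOR_RGBA2GRAY);
  // Threshold pair (50, 150) is a placeholder; palm creases are faint
  cv.Canny(gray, edges, 50, 150);

  cv.imshow(outCanvasId, edges); // white lines on black
  src.delete(); gray.delete(); edges.delete();
}
```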
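
Measuring the lines is the scary part. Assuming that's somehow solved and each major line comes out as an ordered array of `{x, y}` points, position, length and a crude curvature fall out of basic geometry; the arc-length-to-chord ratio used here is just one cheap proxy for curvature.

```js
// Hypothetical: assumes a palm line has already been extracted as an
// ordered array of {x, y} points (the hard part, not solved here).

function measureLine(points) {
  // Arc length: sum of distances between consecutive points
  let arcLength = 0;
  for (let i = 1; i < points.length; i++) {
    arcLength += Math.hypot(points[i].x - points[i - 1].x,
                            points[i].y - points[i - 1].y);
  }

  // Straight-line distance between the endpoints
  const first = points[0];
  const last = points[points.length - 1];
  const chord = Math.hypot(last.x - first.x, last.y - first.y);

  return {
    start: first,
    end: last,
    length: arcLength,
    // 1.0 = perfectly straight; bigger = more curved
    curvature: chord > 0 ? arcLength / chord : Infinity,
  };
}
```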
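
The overlay graphic only needs the 2D canvas API: trace each extracted line as a coloured polyline on a transparent canvas stacked over the palm photo. The colour choices are placeholders.

```js
// Colours per major line are my own picks
const LINE_COLOURS = { heart: "crimson", head: "royalblue", life: "seagreen" };

function drawPalmLine(ctx, points, name) {
  ctx.strokeStyle = LINE_COLOURS[name] || "gold";
  ctx.lineWidth = 3;
  ctx.beginPath();
  ctx.moveTo(points[0].x, points[0].y);
  for (let i = 1; i < points.length; i++) {
    ctx.lineTo(points[i].x, points[i].y);
  }
  ctx.stroke();
}
```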
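
And a toy version of the prediction step: a lookup table of rules standing in for the palm reading guide. Every threshold and reading below is made up (and lengths are in pixels, so they'd need normalising by hand size).

```js
// Placeholder rules, to be replaced with entries from an actual guide
const GUIDE = {
  heart: [
    { test: (m) => m.curvature > 1.2, text: "Curved heart line: wears their heart on their sleeve." },
    { test: () => true,               text: "Straight heart line: pragmatic in matters of love." },
  ],
  head: [
    { test: (m) => m.length > 200,    text: "Long head line: thinks things through (allegedly)." },
    { test: () => true,               text: "Short head line: quick, instinctive decisions." },
  ],
  life: [
    { test: (m) => m.curvature > 1.3, text: "Sweeping life line: energy to spare." },
    { test: () => true,               text: "Shallow life line: slow and steady." },
  ],
};

function generateReading(measurements) {
  // measurements: { heart: {...}, head: {...}, life: {...} } from measureLine()
  return Object.entries(measurements).map(
    ([name, m]) => GUIDE[name].find((rule) => rule.test(m)).text
  );
}
```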