Gist

Here’s a fun thing to try out with TensorFlow.js: take whatever function you like and let a network learn it. The universal approximation theorem tells us that any continuous function on a bounded interval can be approximated arbitrarily well by a network with enough hidden units. No surprise there: any such function can be reproduced by a Fourier expansion too. Given enough parameters and a limited interval the result feels conceptually obvious, yet it is somewhat trickier to prove rigorously.
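Roughly, the parallel looks like this (a sketch, with σ a fixed nonlinearity such as tanh and N the number of hidden units or Fourier terms):

```latex
% One hidden layer of N units vs. a truncated Fourier series on a bounded interval:
% both approximate f(x) by a finite weighted sum of simple basis functions.
f(x) \;\approx\; \sum_{i=1}^{N} a_i\, \sigma(w_i x + b_i)
\qquad \text{vs.} \qquad
f(x) \;\approx\; \frac{a_0}{2} + \sum_{n=1}^{N} \big( a_n \cos(nx) + b_n \sin(nx) \big)
```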

The code fits the cosine function and uses Plotly for the plots; the essence, though, is the stack of dense layers and how they are initialized.
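As a rough sketch of what such a script can look like (not the exact gist code; the layer sizes, tanh activation, Glorot initialization and training settings are just illustrative choices):

```ts
import * as tf from '@tensorflow/tfjs';

// Fit y = cos(x) on [-π, π] with a small stack of dense layers.
async function fitCosine() {
  // Training data: 200 evenly spaced points on a limited interval.
  const n = 200;
  const xs = tf.linspace(-Math.PI, Math.PI, n).reshape([n, 1]);
  const ys = tf.cos(xs);

  // A few dense layers; Glorot-uniform initialization keeps the early
  // activations well scaled so training settles quickly.
  const model = tf.sequential();
  model.add(tf.layers.dense({
    units: 32, activation: 'tanh', inputShape: [1],
    kernelInitializer: 'glorotUniform',
  }));
  model.add(tf.layers.dense({ units: 32, activation: 'tanh' }));
  model.add(tf.layers.dense({ units: 1 })); // linear output

  model.compile({ optimizer: tf.train.adam(0.01), loss: 'meanSquaredError' });
  await model.fit(xs, ys, { epochs: 300, batchSize: 32, verbose: 0 });

  // Return plain arrays, ready to be handed to Plotly as two traces
  // (true cosine vs. network prediction).
  const yHat = model.predict(xs) as tf.Tensor;
  return { x: await xs.data(), yTrue: await ys.data(), yPred: await yHat.data() };
}
```

Once `fitCosine` resolves, the three arrays can go straight into something like `Plotly.newPlot` as two traces for a visual comparison.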

The TensorFlow team did a great job ensuring API similarity, so if you’re familiar with the Python API, this kind of code can be assembled in a meeting.
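To illustrate the point, here is how each call lines up with its Keras counterpart (the Python side is shown as comments; both snippets are illustrative rather than taken from the gist):

```ts
import * as tf from '@tensorflow/tfjs';

// TypeScript (tf.layers)                            // Python (tf.keras)
const model = tf.sequential();                       // model = tf.keras.Sequential()
model.add(tf.layers.dense({ units: 32,               // model.add(tf.keras.layers.Dense(32,
  activation: 'tanh', inputShape: [1] }));           //     activation='tanh', input_shape=(1,)))
model.compile({ optimizer: 'adam',                   // model.compile(optimizer='adam',
  loss: 'meanSquaredError' });                       //     loss='mean_squared_error')
```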