Orthogonal polynomials appear in many areas of applied mathematics, and in particular in numerical analysis. Within machine learning and optimization, they typically provide natural basis functions with good approximation properties, and they can be used to model various acceleration mechanisms. A series of articles has investigated the use of certain (orthogonal) polynomial families in machine learning algorithms, with applications such as facial expression recognition. In the majority of these articles, the Chebyshev polynomials of the first kind and the Bernstein polynomials have been used.
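To make the basis-function idea concrete, the following minimal Python sketch (illustrative only, and not taken from the works cited above) fits a noisy one-dimensional target by least squares on Chebyshev features of the first kind, using NumPy's numpy.polynomial.chebyshev module; the target function, the noise level, and the degree are arbitrary choices for the example.

import numpy as np
from numpy.polynomial import chebyshev as C

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=200)                     # inputs on [-1, 1]
y = np.sin(3 * x) + 0.1 * rng.standard_normal(x.shape)   # noisy target

deg = 8
Phi = C.chebvander(x, deg)                      # design matrix with columns T_0(x), ..., T_deg(x)
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # linear model in the Chebyshev basis

x_test = np.linspace(-1.0, 1.0, 5)
y_pred = C.chebvander(x_test, deg) @ coef       # evaluate the fitted model
print(np.c_[x_test, y_pred, np.sin(3 * x_test)])

Swapping the design matrix for one built from another polynomial family (for example, a Bernstein basis) is a one-line change; this kind of substitution is exactly what the questions below ask about.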
Therefore, the following questions arise:
1. Would the results in the literature remain valid if we used other (orthogonal) polynomial families, for example the Appell or the Bernoulli polynomials, the classical orthogonal polynomials of a continuous, discrete, or q-discrete variable, or the q-Bernstein polynomials?
2. Can we show that the results can be improved by using other (orthogonal) polynomial families, specifically in applications to machine learning or neural networks?
3. Would the theory of multidimensional Bernstein polynomials remain valid, or could it be improved, if we used, for example, multivariate orthogonal polynomials, multivariate q-Bernstein polynomials (yet to be defined), or other families of polynomials?
The main aims of my visit will be:
1. to find more works on the application of orthogonal polynomials in machine learning and/or neural networks;
2. to go through the literature and see what has been done so far to bridge orthogonal polynomials and machine learning and/or neural networks;
3. to give possible answers to the above-mentioned questions;
4. to use our thorough knowledge of orthogonal polynomials to find further applications of the classical orthogonal polynomials of a continuous, discrete, and q-discrete variable in the field of artificial intelligence.