
How to Set Up the Learning Rate Per Iteration in CNTK using C#

22 Nov 2017 · CPOL · 2 min read

So far, we have seen how to train and validate models in CNTK using C#. There are, however, many more details worth exploring in order to understand the CNTK library better. One of the important features, not only in CNTK but in every deep neural network (DNN), is the learning rate.

In an ANN, the learning rate is the factor by which the derivative (gradient) is multiplied before it is subtracted from the weight. If the learning rate is too large, the loss function will increase and the network will diverge. On the other hand, if it is too small, the loss function changes very little per step and convergence proceeds too slowly. So selecting the right value for this parameter is important. During the training process, the learning rate is usually defined as a constant value. In CNTK, a constant learning rate is defined as follows:

// set learning rate for the network
var learningRate = new TrainingParameterScheduleDouble(0.2, 1);

In the code above, the learning rate is set to a value of 0.2 per sample. This means that the whole training process will run with a constant learning rate of 0.2.
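For context, a schedule like this is typically passed to a learner when the trainer is created. The following is only a sketch: `model`, `loss`, and `evalError` are assumed to have been defined earlier (as in the previous articles in this series).

```csharp
// Sketch: plug the constant learning-rate schedule into an SGD learner.
// `model`, `loss` and `evalError` are assumed to exist elsewhere.
var learningRate = new TrainingParameterScheduleDouble(0.2, 1);
var learner = Learner.SGDLearner(model.Parameters(), learningRate);
var trainer = Trainer.CreateTrainer(model, loss, evalError,
    new List<Learner>() { learner });
```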

CNTK also supports changing the learning rate dynamically.
Assume we want to set up different learning rates so that from the first to the 100th iteration the learning rate is 0.2, from the 100th to the 500th iteration it is 0.1, and after the 500th iteration until the end of the training process it is 0.05.

This can be expressed as:

lr1 = 0.2, from iteration 1 to 100
lr2 = 0.1, from iteration 100 to 500
lr3 = 0.05, from iteration 500 to the end of the training process

To set up the learning rate dynamically, we need to use the PairSizeTDouble class to define the schedule. For the above requirements, the following code should be implemented:

PairSizeTDouble p1 = new PairSizeTDouble(2, 0.2);
PairSizeTDouble p2 = new PairSizeTDouble(10, 0.1);
PairSizeTDouble p3 = new PairSizeTDouble(1, 0.05);

var vp = new VectorPairSizeTDouble() { p1, p2, p3 };
var learningRatePerSample = new CNTK.TrainingParameterScheduleDouble(vp, 50);

First, we define a PairSizeTDouble object for each learning-rate value, pairing it with an integer count that will later be multiplied by the epoch size.

Once the rates are defined, we build an array of them by creating a VectorPairSizeTDouble object. The array is then passed as the first argument to the TrainingParameterScheduleDouble constructor. The second argument is the multiplier (epoch size). So for the first pair, 2 is multiplied by 50, which gives 100: the number of samples trained with a rate of 0.2. The same multiplication applies to the remaining pairs, and the last pair's value stays in effect until the end of training.
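To see how the pairs expand, the lookup can be imitated in plain C# with no CNTK dependency. This is only a sketch of my reading of the schedule semantics: each pair's value applies for count × epochSize samples, one span after another, and the last value persists to the end.

```csharp
using System;

class ScheduleDemo
{
    // Returns the learning rate in effect at a given sample index,
    // imitating how the (count, value) pairs expand with the epoch size.
    static double RateAt(long sample, (long Count, double Rate)[] pairs, long epochSize)
    {
        long boundary = 0;
        foreach (var (count, rate) in pairs)
        {
            boundary += count * epochSize;   // e.g. 2 * 50 = 100 samples at 0.2
            if (sample < boundary) return rate;
        }
        return pairs[pairs.Length - 1].Rate; // last value persists to the end
    }

    static void Main()
    {
        var pairs = new (long, double)[] { (2, 0.2), (10, 0.1), (1, 0.05) };
        Console.WriteLine(RateAt(50, pairs, 50));   // within the first 2 * 50 samples -> 0.2
        Console.WriteLine(RateAt(300, pairs, 50));  // within the next 10 * 50 samples -> 0.1
        Console.WriteLine(RateAt(5000, pairs, 50)); // afterwards -> 0.05
    }
}
```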

Filed under: .NET, C#, CNTK, CodeProject
Tagged: C#, CNTK, CodeProject, Machine Learning


This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

Written By
Software Developer (Senior)
Bosnia and Herzegovina Bosnia and Herzegovina
Bahrudin Hrnjica holds a Ph.D. degree in Technical Science/Engineering from University in Bihać.
Besides teaching at the university, he has been in the software industry for more than two decades, focusing on development technologies such as .NET, Visual Studio, and desktop/web/cloud solutions.

He works on the development and application of different ML algorithms, with more than 10 years of experience in developing ML-oriented solutions and models. His fields of interest include the development of predictive models with ML.NET and Keras, and he actively develops two ML-based .NET open-source projects: GPdotNET, a genetic programming tool, and ANNdotNET, a deep learning tool on the .NET platform. He works in multidisciplinary teams with the mission of optimizing and selecting ML algorithms to build ML models.

He is the author of several books and many online articles, writes a blog, regularly holds lectures at local and regional conferences, user groups and Code Camp gatherings, and is the founder of the Bihac Developer Meetup Group. Microsoft recognized his work and awarded him the prestigious Microsoft MVP title for the first time in 2011, a title he still holds today.

Comments and Discussions

Question: How is 'iteration' defined? — Faisal Waris, 7-Jul-18

Question: Suggestion for another article - Saving and reusing the model? — asiwel, 23-Nov-17
Answer: Re: Suggestion for another article - Saving and reusing the model? — Bahrudin Hrnjica, 24-Nov-17
Re: Suggestion for another article - Saving and reusing the model? — asiwel, 24-Nov-17
Re: Suggestion for another article - Saving and reusing the model? — Bahrudin Hrnjica, 25-Nov-17
Re: Suggestion for another article - Anticipating that! — asiwel, 25-Nov-17

Praise: This works for setting up a momentum schedule too. — asiwel, 23-Nov-17
Hi, Bahrudin. Excellent follow-up article! Works splendidly.

This note follows up our conversation in your previous article at:
Testing and Validation CNTK Models using C#

To use the MomentumSGDLearner() method, you can set a momentum-rate schedule the same way you have shown for the learning-rate schedule, like this:
  PairSizeTDouble m1 = new PairSizeTDouble(500, 0.005);
  PairSizeTDouble m2 = new PairSizeTDouble(500, 0.001);
  PairSizeTDouble m3 = new PairSizeTDouble(1, 0.0005);

  var vm = new VectorPairSizeTDouble() { m1, m2, m3 };
  var momentumSchedulePerSample =
      new CNTK.TrainingParameterScheduleDouble(vm, 1);
  var myLearner = Learner.MomentumSGDLearner(ffnn_model.Parameters(),
      learningRatePerSample, momentumSchedulePerSample, true);

Another way to set momentum that I saw in an example is:
var momentumSchedulePerSample = CNTKLib.MomentumAsTimeConstantSchedule(256);

That works too.

PS / In the 2nd paragraph, 3rd sentence, I think you mean "...convergent progress" rather than "...divergent progress" would go too slowly.

modified 23-Nov-17 14:03pm.

Re: This works for setting up a momentum schedule too. — Bahrudin Hrnjica, 24-Nov-17
