

### Section 7-4 : Variation of Parameters

We now need to take a look at the second method of determining a particular solution to a differential equation. As we did when we first saw Variation of Parameters we’ll go through the whole process and derive a set of formulas that can be used to generate a particular solution.

However, as we saw previously when looking at 2^{nd} order differential equations, this method can lead to integrals that are not easy to evaluate. So, while this method, unlike Undetermined Coefficients, can always be used to at least write down a formula for a particular solution, it is not always going to be possible to actually evaluate those integrals and get an explicit solution.

So let’s get started on the process. We’ll start with the differential equation,

\[\begin{equation}{y^{\left( n \right)}} + {p_{n - 1}}\left( t \right){y^{\left( {n - 1} \right)}} + \cdots + {p_1}\left( t \right)y' + {p_0}\left( t \right)y = g\left( t \right)\label{eq:eq1}\end{equation}\]and assume that we’ve found a fundamental set of solutions, \({y_1}\left( t \right),{y_2}\left( t \right), \ldots ,{y_n}\left( t \right)\), for the associated homogeneous differential equation.

Because we have a fundamental set of solutions to the homogeneous differential equation we now know that the complementary solution is,

\[y\left( t \right) = {c_1}{y_1}\left( t \right) + {c_2}{y_2}\left( t \right) + \cdots + {c_n}{y_n}\left( t \right)\]The method of variation of parameters involves trying to find a set of new functions, \({u_1}\left( t \right),{u_2}\left( t \right), \ldots ,{u_n}\left( t \right)\) so that,

\[\begin{equation}Y\left( t \right) = {u_1}\left( t \right){y_1}\left( t \right) + {u_2}\left( t \right){y_2}\left( t \right) + \cdots + {u_n}\left( t \right){y_n}\left( t \right)\label{eq:eq2}\end{equation}\]will be a solution to the nonhomogeneous differential equation. In order to determine if this is possible, and to find the \({u_i}\left( t \right)\) if it is possible, we’ll need a total of \(n\) equations involving the unknown functions that we can (hopefully) solve.

One of the equations is easy. The guess, \(\eqref{eq:eq2}\), will need to satisfy the original differential equation, \(\eqref{eq:eq1}\). So, let’s start taking some derivatives and as we did when we first looked at variation of parameters we’ll make some assumptions along the way that will simplify our work and in the process generate the remaining equations we’ll need.

The first derivative of \(\eqref{eq:eq2}\) is,

\[Y'\left( t \right) = {u_1}{y'_1} + {u_2}{y'_2} + \cdots + {u_n}{y'_n} + {u'_1}{y_1} + {u'_2}{y_2} + \cdots + {u'_n}{y_n}\]Note that we rearranged the results of the differentiation process a little here and we dropped the \(\left( t \right)\) part on the \(u\) and \(y\) to make this a little easier to read. Now, if we keep differentiating this it will quickly become unwieldy and so let’s make an assumption to simplify things here. Because we are after the \({u_i}\left( t \right)\) we should probably try to avoid letting the derivatives on these become too large. So, let’s make the assumption that,

\[{u'_1}{y_1} + {u'_2}{y_2} + \cdots + {u'_n}{y_n} = 0\]The natural question at this point is does this even make sense to do? The answer is, if we end up with a system of \(n\) equations that we can solve for the \({u_i}\left( t \right)\) then yes it does make sense to do. Of course, the other answer is, we wouldn’t be making this assumption if we didn’t know that it was going to work. However, to accept this answer requires that you trust us to make the correct assumptions so maybe the first answer is the best at this point.

At this point the first derivative of \(\eqref{eq:eq2}\) is,

\[Y'\left( t \right) = {u_1}{y'_1} + {u_2}{y'_2} + \cdots + {u_n}{y'_n}\]and so we can now take the second derivative to get,

\[Y''\left( t \right) = {u_1}{y''_1} + {u_2}{y''_2} + \cdots + {u_n}{y''_n} + {u'_1}{y'_1} + {u'_2}{y'_2} + \cdots + {u'_n}{y'_n}\]This looks an awful lot like the original first derivative prior to us simplifying it so let’s again make a simplification. We’ll again want to keep the derivatives on the \({u_i}\left( t \right)\) to a minimum so this time let’s assume that,

\[{u'_1}{y'_1} + {u'_2}{y'_2} + \cdots + {u'_n}{y'_n} = 0\]and with this assumption the second derivative becomes,

\[Y''\left( t \right) = {u_1}{y''_1} + {u_2}{y''_2} + \cdots + {u_n}{y''_n}\]Hopefully you’re starting to see a pattern develop here. If we continue this process for the first \(n - 1\) derivatives we will arrive at the following formula for these derivatives.

\[\begin{equation}{Y^{\left( k \right)}}\left( t \right) = {u_1}y_1^{\left( k \right)} + {u_2}y_2^{\left( k \right)} + \cdots + {u_n}y_n^{\left( k \right)}\hspace{0.25in}\hspace{0.25in}k = 1,2, \ldots ,n - 1\label{eq:eq3}\end{equation}\]To get to each of these formulas we also had to assume that,

\[\begin{equation}{u'_1}y_1^{\left( k \right)} + {u'_2}y_2^{\left( k \right)} + \cdots + {u'_n}y_n^{\left( k \right)} = 0\hspace{0.25in}\hspace{0.25in}\hspace{0.25in}k = 0,1, \ldots ,n - 2\label{eq:eq4}\end{equation}\]and recall that the 0^{th} derivative of a function is just the function itself. So, for example, \(y_2^{\left( 0 \right)}\left( t \right) = {y_2}\left( t \right)\).

Notice as well that the set of assumptions in \(\eqref{eq:eq4}\) actually gives us \(n - 1\) equations in terms of the derivatives of the unknown functions: \({u_1}\left( t \right),{u_2}\left( t \right), \ldots ,{u_n}\left( t \right)\).

All we need to do then is finish generating the first equation we started this process to find (*i.e.* plugging \(\eqref{eq:eq2}\) into \(\eqref{eq:eq1}\)). To do this we’ll need one more derivative of the guess. Differentiating the \({\left( {n - 1} \right)^{{\mbox{st}}}}\) derivative, which we can get from \(\eqref{eq:eq3}\), to get the \(n\)^{th} derivative gives,

\[{Y^{\left( n \right)}}\left( t \right) = {u_1}y_1^{\left( n \right)} + {u_2}y_2^{\left( n \right)} + \cdots + {u_n}y_n^{\left( n \right)} + {u'_1}y_1^{\left( {n - 1} \right)} + {u'_2}y_2^{\left( {n - 1} \right)} + \cdots + {u'_n}y_n^{\left( {n - 1} \right)}\]

This time we’ll not be making any assumptions to simplify this but instead just plug this, along with the derivatives given in \(\eqref{eq:eq3}\), into the differential equation, \(\eqref{eq:eq1}\).

\[\begin{align*}{u_1}y_1^{\left( n \right)} + {u_2}y_2^{\left( n \right)} + \cdots + {u_n}y_n^{\left( n \right)} + {{u'}_1}y_1^{\left( {n - 1} \right)} + {{u'}_2}y_2^{\left( {n - 1} \right)} + \cdots + {{u'}_n}y_n^{\left( {n - 1} \right)} & + \\ {p_{n - 1}}\left( t \right)\left[ {{u_1}y_1^{\left( {n - 1} \right)} + {u_2}y_2^{\left( {n - 1} \right)} + \cdots + {u_n}y_n^{\left( {n - 1} \right)}} \right] & + \\ \vdots \hspace{1.1in} & \\ {p_1}\left( t \right)\left[ {{u_1}{{y'}_1} + {u_2}{{y'}_2} + \cdots + {u_n}{{y'}_n}} \right] & + \\ {p_0}\left( t \right)\left[ {{u_1}{y_1} + {u_2}{y_2} + \cdots + {u_n}{y_n}} \right] & = g\left( t \right)\end{align*}\]Next, rearrange this a little to get,

\[\begin{align*}{u_1}\left[ {y_1^{\left( n \right)} + {p_{n - 1}}\left( t \right)y_1^{\left( {n - 1} \right)} + \cdots + {p_1}\left( t \right){{y'}_1} + {p_0}\left( t \right){y_1}} \right] & + \\ {u_2}\left[ {y_2^{\left( n \right)} + {p_{n - 1}}\left( t \right)y_2^{\left( {n - 1} \right)} + \cdots + {p_1}\left( t \right){{y'}_2} + {p_0}\left( t \right){y_2}} \right] & + \\ \vdots \hspace{2.0in} & \\ {u_n}\left[ {y_n^{\left( n \right)} + {p_{n - 1}}\left( t \right)y_n^{\left( {n - 1} \right)} + \cdots + {p_1}\left( t \right){{y'}_n} + {p_0}\left( t \right){y_n}} \right] & + \\ {{u'}_1}y_1^{\left( {n - 1} \right)} + {{u'}_2}y_2^{\left( {n - 1} \right)} + \cdots + {{u'}_n}y_n^{\left( {n - 1} \right)} & = g\left( t \right)\end{align*}\]Recall that \({y_1}\left( t \right),{y_2}\left( t \right), \ldots ,{y_n}\left( t \right)\) are all solutions to the homogeneous differential equation and so all the quantities in the \(\left[ {\,\,} \right]\) are zero and this reduces down to,

\[{\kern 1pt} {u'_1}y_1^{\left( {n - 1} \right)} + {u'_2}y_2^{\left( {n - 1} \right)} + \cdots + {u'_n}y_n^{\left( {n - 1} \right)} = g\left( t \right)\]So, this equation, along with those given in \(\eqref{eq:eq4}\), give us the \(n\) equations that we needed. Let’s list them all out here for the sake of completeness.

\[\begin{align*}{{u'}_1}{y_1} + {{u'}_2}{y_2} + \cdots + {{u'}_n}{y_n} & = 0\\ {{u'}_1}{{y'}_1} + {{u'}_2}{{y'}_2} + \cdots + {{u'}_n}{{y'}_n} & = 0\\ {{u'}_1}{{y''}_1} + {{u'}_2}{{y''}_2} + \cdots + {{u'}_n}{{y''}_n} & = 0\\ \vdots \hspace{0.85in} & \\ {{u'}_1}y_1^{\left( {n - 2} \right)} + {{u'}_2}y_2^{\left( {n - 2} \right)} + \cdots + {{u'}_n}y_n^{\left( {n - 2} \right)} & = 0\\ {{u'}_1}y_1^{\left( {n - 1} \right)} + {{u'}_2}y_2^{\left( {n - 1} \right)} + \cdots + {{u'}_n}y_n^{\left( {n - 1} \right)} & = g\left( t \right)\end{align*}\]So, we’ve got \(n\) equations, but notice that just like we got when we did this for 2^{nd} order differential equations the unknowns in the system are not \({u_1}\left( t \right),{u_2}\left( t \right), \ldots ,{u_n}\left( t \right)\) but instead they are the derivatives, \({u'_1}\left( t \right),{u'_2}\left( t \right), \ldots ,{u'_n}\left( t \right)\). This isn’t a major problem however. Provided we can solve this system we can then just integrate the solutions to get the functions that we’re after.

Also, recall that the \({y_1}\left( t \right),{y_2}\left( t \right), \ldots ,{y_n}\left( t \right)\) are assumed to be known functions and so they along with their derivatives (which appear in the system) are all known quantities in the system.
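Written in matrix form the system is just the Wronskian matrix of the fundamental set acting on the column of unknown derivatives,

\[\left( {\begin{array}{*{20}{c}}{{y_1}}&{{y_2}}& \cdots &{{y_n}}\\{{{y'}_1}}&{{{y'}_2}}& \cdots &{{{y'}_n}}\\ \vdots & \vdots & \ddots & \vdots \\{y_1^{\left( {n - 1} \right)}}&{y_2^{\left( {n - 1} \right)}}& \cdots &{y_n^{\left( {n - 1} \right)}}\end{array}} \right)\left( {\begin{array}{*{20}{c}}{{{u'}_1}}\\{{{u'}_2}}\\ \vdots \\{{{u'}_n}}\end{array}} \right) = \left( {\begin{array}{*{20}{c}}0\\0\\ \vdots \\{g\left( t \right)}\end{array}} \right)\]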

Now, we need to think about how to solve this system. If there aren’t too many equations we can just solve it directly if we want to. However, for large \(n\) (and it won’t take much to get large here) that could be quite tedious and prone to error and it won’t work at all for general \(n\) as we have here.

The best solution method to use at this point is then Cramer’s Rule. We’ve used Cramer’s Rule several times in this course, but the best reference for our purposes here is when we used it when we first defined Fundamental Sets of Solutions back in the 2^{nd} order material.
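To see the mechanics of Cramer’s Rule in isolation, here is a minimal sympy sketch (the \(2 \times 2\) system at the bottom is made up purely for illustration):

```python
import sympy as sp

def cramer_solve(A, b):
    """Solve A x = b via Cramer's Rule: x_i = det(A_i) / det(A),
    where A_i is A with its i-th column replaced by b."""
    d = A.det()
    xs = []
    for i in range(A.cols):
        Ai = A.copy()
        Ai[:, i] = b          # replace column i with the right-hand side
        xs.append(sp.simplify(Ai.det() / d))
    return xs

# a made-up 2x2 system just to show the mechanics
A = sp.Matrix([[2, 1], [1, 3]])
b = sp.Matrix([3, 5])
print(cramer_solve(A, b))  # [4/5, 7/5]
```

This is exactly the computation we are about to do symbolically, with the Wronskian matrix playing the role of `A`.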

Upon using Cramer’s Rule to solve the system the resulting solution for each \({u'_i}\) will be a quotient of two determinants of \(n \times n\) matrices. The denominator of each solution will be the determinant of the matrix of the known coefficients,

\[\left| {\begin{array}{*{20}{c}}{{y_1}}&{{y_2}}& \cdots &{{y_n}}\\{{{y'}_1}}&{{{y'}_2}}& \cdots &{{{y'}_n}}\\ \vdots & \vdots & \ddots & \vdots \\{y_1^{\left( {n - 1} \right)}}&{y_2^{\left( {n - 1} \right)}}& \cdots &{y_n^{\left( {n - 1} \right)}}\end{array}} \right| = W\left( {{y_1},{y_2}, \ldots {y_n}} \right)\left( t \right)\]This, however, is just the Wronskian of \({y_1}\left( t \right),{y_2}\left( t \right), \ldots ,{y_n}\left( t \right)\) as noted above and because we have assumed that these form a fundamental set of solutions we also know that the Wronskian will not be zero. This in turn tells us that the system above is in fact solvable and all of the assumptions we apparently made out of the blue above did in fact work.

The numerators of the solution for \({u'_i}\) will be the determinant of the matrix of coefficients with the \(i\)^{th} column replaced with the column \(\left( {0,0,0, \ldots ,0,g\left( t \right)} \right)\). For example, the numerator for the first one, \({u'_1}\), is,

\[\left| {\begin{array}{*{20}{c}}0&{{y_2}}& \cdots &{{y_n}}\\0&{{{y'}_2}}& \cdots &{{{y'}_n}}\\ \vdots & \vdots & \ddots & \vdots \\{g\left( t \right)}&{y_2^{\left( {n - 1} \right)}}& \cdots &{y_n^{\left( {n - 1} \right)}}\end{array}} \right|\]

Now, by a nice property of determinants if we factor something out of one of the columns of a matrix then the determinant of the resulting matrix will be the factor times the determinant of the new matrix. In other words, if we factor \(g\left( t \right)\) out of this matrix we arrive at,

\[\left| {\begin{array}{*{20}{c}}0&{{y_2}}& \cdots &{{y_n}}\\0&{{{y'}_2}}& \cdots &{{{y'}_n}}\\ \vdots & \vdots & \ddots & \vdots \\{g\left( t \right)}&{y_2^{\left( {n - 1} \right)}}& \cdots &{y_n^{\left( {n - 1} \right)}}\end{array}} \right| = g\left( t \right)\left| {\begin{array}{*{20}{c}}0&{{y_2}}& \cdots &{{y_n}}\\0&{{{y'}_2}}& \cdots &{{{y'}_n}}\\ \vdots & \vdots & \ddots & \vdots \\1&{y_2^{\left( {n - 1} \right)}}& \cdots &{y_n^{\left( {n - 1} \right)}}\end{array}} \right|\]We did this only for the first one, but we could just as easily have done this with any of the \(n\) solutions. So, let \({W_i}\) represent the determinant we get by replacing the \(i\)^{th} column of the Wronskian with the column \(\left( {0,0,0, \ldots ,0,1} \right)\) and the solution to the system can then be written as,

\[{u'_i} = \frac{{g\left( t \right){W_i}\left( t \right)}}{{W\left( t \right)}}\hspace{0.25in}\hspace{0.25in}i = 1,2, \ldots ,n\]

Wow! That was a lot of effort to generate and solve the system but we’re almost there. With the solution to the system in hand we can now integrate each of these terms to determine the unknown functions, \({u_1}\left( t \right),{u_2}\left( t \right), \ldots ,{u_n}\left( t \right)\), that we’ve been after all along.

\[{u_1} = \int{{\frac{{g\left( t \right){W_1}\left( t \right)}}{{W\left( t \right)}}\,dt}},\,\,\,\,\,\,{u_2} = \int{{\frac{{g\left( t \right){W_2}\left( t \right)}}{{W\left( t \right)}}\,dt}},\,\,\,\,\,\, \cdots ,\,\,\,\,\,{u_n} = \int{{\frac{{g\left( t \right){W_n}\left( t \right)}}{{W\left( t \right)}}\,dt}}\]Finally, a particular solution to \(\eqref{eq:eq1}\) is then given by,

\[Y\left( t \right) = {y_1}\left( t \right)\int{{\frac{{g\left( t \right){W_1}\left( t \right)}}{{W\left( t \right)}}\,dt}} + {y_2}\left( t \right)\int{{\frac{{g\left( t \right){W_2}\left( t \right)}}{{W\left( t \right)}}\,dt}} + \cdots + {y_n}\left( t \right)\int{{\frac{{g\left( t \right){W_n}\left( t \right)}}{{W\left( t \right)}}\,dt}}\]We should also note that in the derivation process here we assumed that the coefficient of the \({y^{\left( n \right)}}\) term was one and that has been factored into the formula above. If the coefficient of this term is not one then we’ll need to make sure to divide it out before trying to use this formula.
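The whole formula is short enough to sketch in sympy. The function below builds the Wronskian matrix, forms each \({W_i}\), integrates, and assembles \(Y(t)\); the third order equation it is tried on at the end (fundamental set \(\{{{\bf{e}}^t},{{\bf{e}}^{2t}},{{\bf{e}}^{3t}}\}\), forcing \({{\bf{e}}^{4t}}\)) is made up purely for illustration:

```python
import sympy as sp

t = sp.symbols('t')

def variation_of_parameters(ys, g):
    """Particular solution Y = sum u_i * y_i via the Wronskian formula.
    ys must be a fundamental set for the homogeneous equation, with the
    coefficient of the highest derivative already divided out."""
    n = len(ys)
    M = sp.Matrix([[sp.diff(y, t, k) for y in ys] for k in range(n)])
    W = M.det()
    Y = 0
    for i in range(n):
        Mi = M.copy()
        Mi[:, i] = sp.Matrix([0] * (n - 1) + [1])  # i-th column -> (0,...,0,1)
        ui = sp.integrate(sp.simplify(g * Mi.det() / W), t)  # constants omitted
        Y += ys[i] * ui
    return sp.simplify(Y)

# illustration: y''' - 6y'' + 11y' - 6y = e^{4t}
Y = variation_of_parameters([sp.exp(t), sp.exp(2*t), sp.exp(3*t)], sp.exp(4*t))
print(Y)  # exp(4*t)/6
```

That answer agrees with what Undetermined Coefficients would give for this particular made-up equation, which is a nice cross-check on the formula.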

Before we work an example here we really should note that while we can write this formula down actually computing these integrals may be all but impossible to do.

Okay let’s take a look at a quick example.

**Example 1** Find the general solution to the following differential equation.

\[y''' - 2y'' - 21y' - 18y = 3 + 4{{\bf{e}}^{ - t}}\]

The characteristic equation is,

\[{r^3} - 2{r^2} - 21r - 18 = \left( {r + 3} \right)\left( {r + 1} \right)\left( {r - 6} \right) = 0\hspace{0.25in}\hspace{0.25in} \Rightarrow \hspace{0.25in}{r_{\,1}} = - 3,\,\,\,{r_{\,2}} = - 1,\,\,\,{r_{\,3}} = 6\]So, we have three real distinct roots here and so the complementary solution is,

\[{y_c}\left( t \right) = {c_1}{{\bf{e}}^{ - 3t}} + {c_2}{{\bf{e}}^{ - t}} + {c_3}{{\bf{e}}^{6t}}\]Okay, we’ve now got several determinants to compute. We’ll leave it to you to verify the following determinant computations.

\[\begin{align*} W & = \left| {\begin{array}{*{20}{c}}{{{\bf{e}}^{ - 3t}}}&{{{\bf{e}}^{ - t}}}&{{{\bf{e}}^{6t}}}\\{ - 3{{\bf{e}}^{ - 3t}}}&{ - {{\bf{e}}^{ - t}}}&{6{{\bf{e}}^{6t}}}\\{9{{\bf{e}}^{ - 3t}}}&{{{\bf{e}}^{ - t}}}&{36{{\bf{e}}^{6t}}}\end{array}} \right| = 126{{\bf{e}}^{2t}} & \hspace{0.25in}{W_1} & = \left| {\begin{array}{*{20}{c}}0&{{{\bf{e}}^{ - t}}}&{{{\bf{e}}^{6t}}}\\0&{ - {{\bf{e}}^{ - t}}}&{6{{\bf{e}}^{6t}}}\\1&{{{\bf{e}}^{ - t}}}&{36{{\bf{e}}^{6t}}}\end{array}} \right| = 7{{\bf{e}}^{5t}}\\ {W_2} & = \left| {\begin{array}{*{20}{c}}{{{\bf{e}}^{ - 3t}}}&0&{{{\bf{e}}^{6t}}}\\{ - 3{{\bf{e}}^{ - 3t}}}&0&{6{{\bf{e}}^{6t}}}\\{9{{\bf{e}}^{ - 3t}}}&1&{36{{\bf{e}}^{6t}}}\end{array}} \right| = - 9{{\bf{e}}^{3t}} & \hspace{0.25in} {W_3} & = \left| {\begin{array}{*{20}{c}}{{{\bf{e}}^{ - 3t}}}&{{{\bf{e}}^{ - t}}}&0\\{ - 3{{\bf{e}}^{ - 3t}}}&{ - {{\bf{e}}^{ - t}}}&0\\{9{{\bf{e}}^{ - 3t}}}&{{{\bf{e}}^{ - t}}}&1\end{array}} \right| = 2{{\bf{e}}^{ - 4t}}\end{align*}\]Now, given that \(g\left( t \right) = 3 + 4{{\bf{e}}^{ - t}}\) we can compute each of the \({u_i}\). Here are those integrals.
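If you’d rather not grind through the cofactor expansions by hand, a few lines of sympy will verify all four determinants:

```python
import sympy as sp

t = sp.symbols('t')
ys = [sp.exp(-3*t), sp.exp(-t), sp.exp(6*t)]

# Wronskian matrix: row k holds the k-th derivatives of y1, y2, y3
M = sp.Matrix([[sp.diff(y, t, k) for y in ys] for k in range(3)])
W = sp.simplify(M.det())
print(W)  # 126*exp(2*t)

# W_i: replace the i-th column with the column (0, 0, 1)
Wi = []
for i in range(3):
    Mi = M.copy()
    Mi[:, i] = sp.Matrix([0, 0, 1])
    Wi.append(sp.simplify(Mi.det()))
print(Wi)  # [7*exp(5*t), -9*exp(3*t), 2*exp(-4*t)]
```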

\[{u_1} = \int{{\frac{{\left( {3 + 4{{\bf{e}}^{ - t}}} \right)\left( {7{{\bf{e}}^{5t}}} \right)}}{{126{{\bf{e}}^{2t}}}}\,dt}} = \frac{1}{{18}}\int{{3{{\bf{e}}^{3t}} + 4{{\bf{e}}^{2t}}\,dt}} = \frac{1}{{18}}\left( {{{\bf{e}}^{3t}} + 2{{\bf{e}}^{2t}}} \right)\] \[{u_2} = \int{{\frac{{\left( {3 + 4{{\bf{e}}^{ - t}}} \right)\left( { - 9{{\bf{e}}^{3t}}} \right)}}{{126{{\bf{e}}^{2t}}}}\,dt}} = - \frac{1}{{14}}\int{{3{{\bf{e}}^t} + 4\,dt}} = - \frac{1}{{14}}\left( {3{{\bf{e}}^t} + 4t} \right)\] \[{u_3} = \int{{\frac{{\left( {3 + 4{{\bf{e}}^{ - t}}} \right)\left( {2{{\bf{e}}^{ - 4t}}} \right)}}{{126{{\bf{e}}^{2t}}}}\,dt}} = \frac{1}{{63}}\int{{3{{\bf{e}}^{ - 6t}} + 4{{\bf{e}}^{ - 7t}}\,dt}} = \frac{1}{{63}}\left( { - \frac{1}{2}{{\bf{e}}^{ - 6t}} - \frac{4}{7}{{\bf{e}}^{ - 7t}}} \right)\]Note that we didn’t include the constants of integration in each of these because including them would just have introduced a term that would get absorbed into the complementary solution just as we saw when we were dealing with 2^{nd} order differential equations.
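The integrals are easy to check with sympy as well (constants of integration omitted, as in the text):

```python
import sympy as sp

t = sp.symbols('t')
g, W = 3 + 4*sp.exp(-t), 126*sp.exp(2*t)
W1, W2, W3 = 7*sp.exp(5*t), -9*sp.exp(3*t), 2*sp.exp(-4*t)

u1 = sp.integrate(g*W1/W, t)
u2 = sp.integrate(g*W2/W, t)
u3 = sp.integrate(g*W3/W, t)

# each difference against the hand computation should be zero
print(sp.simplify(u1 - (sp.exp(3*t) + 2*sp.exp(2*t))/18))         # 0
print(sp.simplify(u2 + (3*sp.exp(t) + 4*t)/14))                   # 0
print(sp.simplify(u3 + (sp.exp(-6*t)/2 + 4*sp.exp(-7*t)/7)/63))   # 0
```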

Finally, a particular solution for this differential equation is then,

\[\begin{align*}{Y_P} & = {u_1}{y_1} + {u_2}{y_2} + {u_3}{y_3}\\ & = \frac{1}{{18}}\left( {{{\bf{e}}^{3t}} + 2{{\bf{e}}^{2t}}} \right){{\bf{e}}^{ - 3t}} - \frac{1}{{14}}\left( {3{{\bf{e}}^t} + 4t} \right){{\bf{e}}^{ - t}} + \frac{1}{{63}}\left( { - \frac{1}{2}{{\bf{e}}^{ - 6t}} - \frac{4}{7}{{\bf{e}}^{ - 7t}}} \right){{\bf{e}}^{6t}}\\ & = - \frac{1}{6} + \frac{5}{{49}}{{\bf{e}}^{ - t}} - \frac{2}{7}t{{\bf{e}}^{ - t}}\end{align*}\]The general solution is then,

\[y\left( t \right) = {c_1}{{\bf{e}}^{ - 3t}} + {c_2}{{\bf{e}}^{ - t}} + {c_3}{{\bf{e}}^{6t}} - \frac{1}{6} + \frac{5}{{49}}{{\bf{e}}^{ - t}} - \frac{2}{7}t{{\bf{e}}^{ - t}}\]We’re only going to do a single example in this section to illustrate the process more than anything so with that we’ll close out this section.
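As a final sanity check, you can verify with sympy that this general solution satisfies the example’s differential equation, \(y''' - 2y'' - 21y' - 18y = 3 + 4{{\bf{e}}^{ - t}}\) (read off from the characteristic polynomial and \(g\left( t \right)\)), for any choice of the constants:

```python
import sympy as sp

t, c1, c2, c3 = sp.symbols('t c1 c2 c3')
y = (c1*sp.exp(-3*t) + c2*sp.exp(-t) + c3*sp.exp(6*t)
     - sp.Rational(1, 6) + sp.Rational(5, 49)*sp.exp(-t)
     - sp.Rational(2, 7)*t*sp.exp(-t))

# plug the general solution into the left side and subtract g(t)
residual = (sp.diff(y, t, 3) - 2*sp.diff(y, t, 2) - 21*sp.diff(y, t)
            - 18*y - (3 + 4*sp.exp(-t)))
print(sp.simplify(residual))  # 0
```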