
Suppose $ g $ is a function satisfying the following for all $ x $ and $ y $:

$ g(x+y) = \frac{g(x)+g(y)}{1-g(x)g(y)} $

$ \lim_{h \to 0} g(h) = 0 $

$ \lim_{h \to 0} \frac{g(h)}{h}= 1 $

a. Show that $ g(0) = 0 $.

b. Show that $ g'(x) = 1 + [g(x)]^2 $.

c. Find $ g(x) $ by solving the differential equation in part (b).


Anyone know where to start? I'm defeated at every turn; I can't break the function into even/odd portions that are of any use, and none of the laws of exponentials/logarithms seem to be very helpful. The only fact I can pull out is that $ g'(0)=1 $, which can be determined through L'Hôpital's rule.

--Jmason 15:28, 5 October 2008 (UTC)
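
(A side note on that last fact: once part (a) gives $ g(0) = 0 $, the third given limit is already the difference quotient for $ g'(0) $, so no L'Hôpital is actually needed:

$ g'(0) = \lim_{h \to 0} \frac{g(0+h)-g(0)}{h} = \lim_{h \to 0} \frac{g(h)}{h} = 1 $

Either way, knowing $ g'(0) = 1 $ is a useful starting point for part (b).)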


  • You can show g(0) = 0 by solving for g(x) (yes, you can do it; no, it's not that hard) and then plugging 0 in for x. As for the other parts, I haven't gotten that far yet. I'll see what I get. And wow, I've been working on this problem for half an hour already, I think.
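
One common way to carry out part (a) (a sketch that sets $ x = y = 0 $ directly, assuming the right-hand side is defined there, i.e. $ [g(0)]^2 \neq 1 $):

$ g(0) = g(0+0) = \frac{2g(0)}{1-[g(0)]^2} \quad\Rightarrow\quad g(0)\left(1-[g(0)]^2\right) = 2g(0) \quad\Rightarrow\quad g(0)\left(1+[g(0)]^2\right) = 0 $

Since $ 1+[g(0)]^2 > 0 $, the only possibility is $ g(0) = 0 $.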
