The question is: Does God still have faith in US?

What does it mean when God asks us to have faith in Him? Is it simply that God wants to be reassured that we believe He exists? (Whew! I thought there for a minute they were going to get rid of me!) Or is God asking for something from us beyond simply believing in His existence?

If we conclude that God does exist, does that belief change anything in the material world in which we live? Or does everything to do with faith rest on the hope of some great by-and-by, when we no longer exist in the natural world because we have died and gone on to a better or worse existence in the spiritual realm?

Or maybe it was only the pastors I listened to while growing up who stressed belief and faith. Maybe the ones you experienced were more concerned with death and hell, and faith was just an outward expression of our inward beliefs.

I didn't use to think about these kinds of questions back in the day when I listened to those kinds of preachers and teachers. I wonder if I was happier back then because I didn't think about it.

Like many others, I went through an extended period in my life when I did think about these things, probably because I had teachers who were busy trying to explain life, the universe and everything to me in purely natural terms, rather than the spiritual terms the Bible often uses to explain them. In the end, it is probably pretty easy to explain natural events to those who believe the natural world is all there is.

I was much older when I began to understand that God sometimes operates entirely in the spiritual realm, even when His actions appear to be taking place in the natural realm. Schools and universities at that time were trying to explain that the natural realm was the only realm that existed. They may not have come out and said it, but their underlying belief was that God did not exist, and that they were therefore responsible for explaining how all the things God could not have created actually came to exist, at least in the natural realm.

They didn't much worry about the spiritual realm, since they didn't believe in anything they could not see, touch or feel. Thus they looked to the theory of evolution to explain life, and to the big bang theory to explain rocks and planets and stars and other stuff that we could actually see.

Many of us accepted this silliness, at least for a time, because the silliness was spoken with one voice by our educational culture. Belief in God's existence therefore declined, since "the science" offered alternate explanations of how creation and our existence actually came to be.

God, however, did not seem to need external validation of His existence, and He just kept working away, doing things. Sometimes He did them in the natural world, and sometimes He did them spiritually, in the supernatural world.

The part that many of us missed in going through this educational treadmill was that the spiritual and natural worlds were somehow connected, that actions in one realm could affect events in the other.

God had tried to tell us this earlier, but many of us worked hard to explain His world away. Churches even came up with "explanations" for events they could not explain naturally, such as healing and deliverance. The line they used to explain the inexplicable away was often, "Oh, God doesn't do that anymore." Thus the Bible was no longer true today, though it had been true at one time.

Interestingly enough, this put the churches in charge of determining which parts of the Bible, God's written words, were still true today. God's assurance that "I am the same yesterday, today and tomorrow" fell upon deaf ears.

Given humanity's dwindling faith in God's very existence, why would any of us be concerned with whether God still has faith in us?

Perhaps one good reason we should care is that the world's end-time gospel has become, "Well, you know, everything is going to get worse and worse, and then the end will come!" By extrapolation, the church fails, God rescues His Believers from the world, and the devil wins the world and everyone left in it.

Not exactly the end times we were promised, is it? Where is the glorious church, without spot or wrinkle, the one the Gates of Hell will not prevail against, that we were promised to be a part of? And why would God still have faith that this church is the right Bride for His Beloved Son, Jesus Christ, the One the church teaches sacrificed all for us?

Our having faith in God is important, because – put bluntly – if He does not act, we are all screwed, and the elitists will have won the battle to control humanity. But given the stakes, have we given God good reason to have faith in us? If not, is it too late to begin now?

Armageddon Story, the novels: CraigeMcMillan.com

