Saturday, January 26, 2019


For the most part we only have theories on how planets are formed, and even on how these planets' magnetic fields are generated. Even Earth's plate tectonics is only a theory, though granted it is a very fitting model for what we see and observe.

However, there are problems with some of these theories. For a start, at the time of writing Earth's magnetic field is about 11.05 degrees off alignment with Earth's physical axis of rotation, and this offset appears to be changing at an increasing rate; nothing in current-day physics satisfactorily explains how this could be. Current theory holds that Earth's magnetic field is generated by electrically conductive material (the molten iron core) spinning within the Earth. This theory would be sound enough except for the 11.05-degree asymmetry problem: if spin generates the field, then the relative motion of the Earth's more fluid interior should bring the field into alignment with the physical axis of rotation.
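As a rough illustration of how such an offset angle is measured (not the article's exact 11.05-degree figure, which depends on the field model and year), the angle between the rotation axis and the magnetic dipole axis can be computed from pole coordinates. The geomagnetic-pole latitude/longitude below are approximate, purely illustrative values.

```python
import math

def unit_vector(lat_deg, lon_deg):
    """Convert a latitude/longitude in degrees to a 3D unit vector."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))

def angle_between(v1, v2):
    """Angle in degrees between two unit vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

# Geographic north pole vs. an approximate geomagnetic north pole
# (~80.5 N, 72.8 W -- illustrative values, not an authoritative model).
geographic = unit_vector(90.0, 0.0)
geomagnetic = unit_vector(80.5, -72.8)

tilt = angle_between(geographic, geomagnetic)
print(round(tilt, 1))  # 9.5 -- in the same ballpark as the quoted offset
```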

Progress is being made, though. Current theory assumes that planet formation is simultaneous with the formation of the parent star. It is thought to occur like this: a huge gas cloud is disturbed, leading to a gravitational collapse that gradually increases in spin; this eventually forms a hot, dense centre known as a proto 'parent' star, leaving behind a thin, cooling disc of gas. As the disc thins, particles begin to clump together and eventually become planets, and those planets in turn end up with moons. It sounds reasonable, but we typically only see a final, fully formed planet, so it is difficult to get a complete picture of the formation process and test these theories.

However, this is changing: a team of research scientists was able to obtain the first-ever image of an exoplanet in the process of forming. To achieve this, the scientists used the Spectro-Polarimetric High-contrast Exoplanet Research (SPHERE) instrument on the European Southern Observatory's Very Large Telescope (VLT), which features a coronagraph to block out the star's light, creating an artificial eclipse and revealing any faint orbiting planetary bodies. The observations revealed a newborn gas giant orbiting its parent dwarf star PDS 70; scientists were able to photograph it and determine its distance from the parent star to be about 1.9 billion miles. Further analysis showed that the temperature of this baby planet (~1,200 Kelvin) is much hotter than that of fully formed planets, which is what would be expected and is in line with current theory.

This research has huge implications: for the first time, scientists are able to observe and study the different stages of planetary formation. This will hopefully lead to new insights and a better understanding of the key processes involved in planetary and solar-system formation.








DARPA's drive here appears to date back to the 1970s, when they saw how vulnerable U.S. aircraft and their on-board equipment were to detection and attack by potential enemies, who were deploying new surface-to-air missile (SAM) technology. These systems integrated radar-guided missiles, both surface- and air-launched, all networked with early-warning, acquisition, and targeting radars, and coordinated within sophisticated command-and-control frameworks.


Because of this threat to security, DARPA embarked on a program to develop strategies and technologies for reducing radar detectability. This included changing conventional aircraft shapes to minimize the number of radar return spikes, but it didn't stop there: the list goes on with radar-absorbent materials; infrared shielding, exhaust cooling and shaping, and enhanced heat dissipation; reduced visual signatures; active signature cancellation; inlet shielding; and windshield coatings.


All this development eventually led to a project named 'HAVE Blue', the first practical combat stealth aircraft, which made its first test flight by the end of 1977. Testing and refinement continued, leading to the F-117A stealth fighter, which became operational in October 1983. A follow-on development, the TACIT Blue aircraft, could operate radar sensors while maintaining its own low radar cross-section, and it also inspired the development of the B-2 stealth bomber.


Lockheed Martin HAVE Blue, the F-117 prototype

F-117 stealth fighter, aka Nighthawk


Since their inception, these stealth aircraft have destroyed key targets in conflicts in Iraq, both in 1991's Operation Desert Storm and in 2003 during Operation Iraqi Freedom; in Afghanistan during Operation Enduring Freedom in 2001; and in Libya in 2011.



Many new materials have been developed for the digital tech industry; as a result, we now have flexible surfaces that display graphics. Some even have programmed responses that change the display according to how the surface is bent. Compared to what we think is possible, though, the versions seen so far by the general public are still crude.

We are certain that DARPA (the Defense Advanced Research Projects Agency), along with many corporate interests, wants to make the ultimate smart lens. That is, if they haven't perfected it already, as some variants certainly exist; the new materials mentioned will make this a reality and, one day, a common thing. Google Glass caused a bit of a stir and then somewhat died out, mainly due to people's concerns about privacy. It is still used in some professions more than by the general public, and that hasn't stopped them trying again with other ideas about owning your iris. The ultimate goal is a wearable contact lens that actually overlays visual images and text like a computer screen.

Most of the same tech in Google Glass could be embedded into a lens with the right materials, although it needs to be cost-effective; in fact, it is likely to have already happened inside DARPA programmes for some time now. Of course this is a bold claim, and we don't pretend to know for sure. However, we do have news that the company that has been the most aggressive about bringing electronics to contact lenses is Verily, a subdivision of Alphabet, of which Google is also a part.

(We hope you're learning something here: Google split up in 2015 into many companies under the Alphabet umbrella. Verily Life Sciences is the independent Alphabet subsidiary that's developing advanced contact lenses.)

In 2015 we had news that the latest Verily smart contact lens is actually injected into the eyeball, according to a published patent. So it's less of a contact lens and more of a surgical implant. In this technology, your natural lens is removed from your eyeball; a fluid is then injected into the eye, and that fluid fuses with the eye's lens capsule as it solidifies. Inside this new, artificial lens live storage, a battery, sensors, a radio and other electronics. The artificial lens would take over the job of focusing light onto the retina, improving vision in numerous ways without glasses, but in a flexible, interactive way.

Verily is headed by Andrew Jason Conrad, who is also the inventor of the lens.

Verily is also working with Swiss drug maker Novartis to make and sell smart contact lenses that help people with diabetes track their blood glucose levels. They have also been awarded a patent for solar-powered contact lens technology. One of the applications for smart contact lenses is the improvement of vision without glasses.

Also in 2015, we had news that researchers at the University of Wisconsin-Madison had invented a smart contact lens that can instantly focus the eye; it was said to be under development as a future product for an Israeli company called Deep Optics. The idea is based on the eye of the elephantnose fish. The lens uses electronic circuits and light sensors, powered by a solar cell, all built into the contact lens. When the sensors determine that the eye needs to focus, the chips apply a small electrical current, and this current changes the focal length of the lens in a fraction of a second. When we had this news, final development of a finished product was said to be five years away.
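To put the focal-length change in context, a thin-lens sketch shows the size of the job such an adaptive lens performs: refocusing from a distant object to one 25 cm away requires roughly 4 dioptres of extra optical power. These are textbook idealizations, not Deep Optics specifications.

```python
def required_power_change(near_distance_m):
    """Extra optical power in dioptres (1/metres) needed to refocus from
    infinity to an object at near_distance_m, using the thin-lens relation
    power = 1 / focal_length."""
    return 1.0 / near_distance_m

# A typical reading distance of 25 cm:
print(required_power_change(0.25))  # 4.0 dioptres
```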

Scientists at the University of Michigan have also worked on building a contact lens that could give soldiers and others the ability to see in the dark using thermal imaging. The technology uses graphene to pick up the full spectrum of light, including ultraviolet, and the graphene was integrated with silicon microelectromechanical systems (MEMS). As you might expect, there's some U.S. military funding behind the project.

This is an old set of information, but we decided to publish it anyway, for two reasons: most people seem to have little knowledge that this is going on, and we never published it when we had a similar website going a few years ago. But we are back. There is new info on all this as well, so prepare to have your mind blown; we have many more articles to come.

Tuesday, January 22, 2019



Here we go into the void of the fringe once more and find one of many seemingly separate dots that, once connected, make a much bigger picture. It's people like us at the EMP, and others with a high curiosity about what's next for mankind's technological advancement, who have said the next major leap aside from 3D printing is direct biological integration with digital technologies and even quantum computing.

As the founder and main article poster at the EMP, I will briefly say that mankind seems motivated to be a boot loader for something else, even if we don't collectively realise it. Imagine an ET race passing by Earth merely to observe: these ETs would most likely conclude that these weird primate creatures at the top of the land food chain are building something, perhaps without even knowing it themselves, and that it's competition driving them toward one singular point: a leap in evolution if it's done properly, or devolution and extinction if it's not. Like some ironic, unconscious but collective self-test of human existence.

DARPA is the Defense Advanced Research Projects Agency, which works on solutions for the U.S. military. They are now revealing to the public information that was previously highly classified, involving microchipping the human brain to remotely control more than one aircraft/drone at a time. And no, we are not kidding. Some of you will prefer to hide in sceptical denial that this is even possible. It is possible DARPA are using a misdirection tactic for some national-security reason, perhaps to throw potential hacktivists off the track (although that hypothesis on its own begs another article on the slim chance of such tech being hacked anyway). Regardless, we still believe the technology to do this exists. Anyone who thinks it doesn't should understand that, once you know enough about biology and current technology, it is clearly highly possible, whether it exists yet or not.

Of course there is always a moral argument from some who claim it would never be done because of immoral testing on animal subjects with chipped brains, etc. Sorry, we don't think any military lets a moral debate stand in the way much when it comes to protecting a whole country. According to reports and information direct from DARPA, a person with a brain chip can now pilot a swarm of drones, or even advanced fighter jets. Yes, we know this sounds very sci-fi to the layman, but in truth it's more sci than fi. The White House announced the BRAIN Initiative in April 2013; the same initiative is supported by several federal agencies and dozens of technology firms, academic institutions, scientists and other key contributors to neuroscience research. DARPA is supporting the BRAIN Initiative through a number of programs, and this research actually dates back in its origins to the 1970s, if not before.

DARPA announced that it has successfully demonstrated the use of 'telepathic' thought from a human brain embedded with a specific kind of computer chip, allowing a person to simultaneously command and control three types of drone aircraft by thought alone while watching the drones on a screen; this was demonstrated in a DARPA simulator. The work also appears to build on publicly known research from 2015, which allowed a paralyzed woman to steer a virtual F-35 Joint Strike Fighter with only a small, surgically implantable microchip. Officials said that they have essentially scaled up the technology to allow a user to steer multiple jets at once.

To quote Justin Sanchez, who directs DARPA's biological technology office:

“As of today, signals from the brain can be used to command and control … not just one aircraft but three simultaneous types of aircraft,” 

More importantly, DARPA was able to improve the interaction between the pilot and the simulated jet to allow an operator, a paralyzed man named Nathan in this case, to not only send but receive signals from the craft.

“The signals from those aircraft can be delivered directly back to the brain so that the brain of that user [or pilot] can also perceive the environment,” said Sanchez. “It’s taken a number of years to try and figure this out.”

Ever since the first military drone took to the sky, pilots have wondered if their days in the fight are numbered. Programs like 'Loyal Wingman' already aim to take pilots out of the seats of aging fighter jets like the F-16 or F/A-18 Hornet, putting those jets back in the fight as armed combat drones that would potentially accompany advanced fighters like the F-35 into contested airspace. The F-16 drones could fly support for the more expensive aircraft, which would simultaneously serve as the data-transmission hub for the drone wingmen and as the drone controller.

It's said that this new technology, already proven capable of allowing a single pilot to control multiple aircraft using nothing but her brain, could soon make swarms of drone wingmen a reality in combat zones around the world.


We will be adding more for your information and curiosity soon, and will add the downloadable PDFs from DARPA's website here.

Additionally, you can read more on the DARPA BRAIN Initiative program at

Thursday, January 3, 2019


Yes, I know, yet another 3D-print tech article. We have news that International Business Machines Corp., better known as IBM, has filed for a patent relating to '3D printing on the surface of an acoustic hologram'. First submitted in June 2017, the document has now been published online for public knowledge.

We think this is an incredibly ambitious invention which, like so many other 3D-printing technologies, is not obvious in its potential on its own but vastly paradigm-shifting in terms of how it could evolve and how it could be combined with other technologies. The IBM acoustic-hologram 3D printer presents the potential to fabricate a hollow object in a single step.

Although investigating holograms for their 3D-printing potential is somewhat unusual, it's not an entirely new idea. Papers exploring the potential of laser-light holograms have been in circulation since the early 90s, and possibly before then. We also recently found that DAQRI, a Californian augmented-reality specialist, has been working on efforts to develop similar technology.

The following image is from a video clip posted by the company in 2017: a green laser hologram is shown 3D printing a paperclip from a dish filled with resin.




The main difference between IBM's patent and other similar tech is that it explicitly relates to an acoustic hologram using sound, rather than the typical hologram made with light.

Many have concluded that IBM's current acoustic-hologram 3D printer holds more promise as a temporary way of projecting a physical object into a room than as anything permanent. This is still potentially big in design tech, though: it could be used as a more physical tool for CAD (computer-aided design) models. Some of the people involved in this project from IBM are Harish Bharti, Abhay K. Patra and Sarbajit K. Rakshit, often referred to as the method and system's inventors. Rakshit and Patra are both known to be valuable inventors at IBM, with a combined experience at the company of over 20 years. All the co-inventors on the patent are also listed as IBM architects.

Many are still sceptical of this tech; however, if there's one thing we know, it's that advances in 3D tech are (as predicted) compounding exponentially, day by day.


To quote IBM's team:

'When a 3D-printing medium is applied to the surface of the hologram, the medium solidifies on the hologram’s surface to generate a hollow shell in the shape of the object to be printed.'


Thanks for reading

Article by James Packer

Monday, December 10, 2018


Austen Heinz, founder and CEO of Cambrian Genomics, has shared some incredible news with the public: CG (Cambrian Genomics) is developing hardware and systems for laser-printing the DNA of living organisms. Before we get too excited on a sci-fi tangent, we should get into some details first. This is incredible, and it has many practical applications and implications for the future.

At the time of writing, the challenge for synthetic biology is that, despite scientists now having the potential to create new organisms that can do an infinite number of things, the cost of synthesizing DNA has been prohibitively expensive. All that is about to change. Presently, researchers in the industry order or clone over a billion dollars' worth of DNA each year. Austen is developing his company's technology to change this problem in a big way: he now has a technique to synthesize DNA 10,000 times more cheaply than any known technology. One of the difficulties with DNA synthesis is error correction during fabrication, to produce the correct sequence of A, T, G and Cs, the building blocks of DNA.

Cambrian Genomics has addressed this problem by fabricating billions of strands simultaneously, quickly and cost-effectively. Their technology optically sequences the strands and then selects the correct DNA sequences using a fast-moving laser.
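A toy simulation of that verify-then-select idea (our own illustrative sketch, not Cambrian Genomics' actual chemistry): fabricate many error-prone copies in parallel, 'sequence' them, and keep only the strands that match the intended design. The target sequence and error rate below are made-up numbers.

```python
import random

random.seed(0)

TARGET = "ATGCGTAC"  # the sequence we intended to synthesize (toy example)
BASES = "ATGC"

def synthesize(target, error_rate=0.05):
    """Simulate error-prone parallel synthesis: each base may come out wrong."""
    return "".join(b if random.random() > error_rate else random.choice(BASES)
                   for b in target)

# Fabricate many strands at once, then 'sequence' them and keep only exact
# matches -- loosely analogous to the optical-verify-then-laser-select step.
pool = [synthesize(TARGET) for _ in range(10_000)]
correct = [s for s in pool if s == TARGET]
print(f"{len(correct)}/{len(pool)} strands passed verification")
```

With an 8-base target and a 5% per-base error rate, roughly three quarters of the strands survive; the point is that parallel fabrication plus selection yields plenty of correct product even with imperfect chemistry.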

The company's plan really is to create the 'print button' for biology, in order to 'democratise' creation and enable researchers to literally write code for living organisms. This is fascinating and, of course, raises some scary perspectives on how it could be used.

According to Austen, this technology has the potential to be used for personalised medicine, such as cancer treatments, or even for bringing back extinct animals using DNA code recovered from fossils. It is early in this technology's development, but just as with the more popularised forms of 3D printing out there, we have seen how fast this kind of tech develops.


As a writer and researcher I cannot emphasise it enough when I say: '3D printing will leave no aspect of human life untouched.'

Watch the video below to learn more.

Article by JAMES PACKER.

slight grammar clean up by 


Thursday, November 29, 2018


As 3D-printing tech evolves past a mere one material set per print into metamaterials (multiple materials per print, arranged in a set matrix to generate intelligent materials, or materials that would not be possible by other conventional means), we might end up seeing advances in wearable technologies with multi-functional attributes. In theory there are many advances that could follow from this, from hydrophobic materials that stay dry to fabric that self-repairs! For now I will leave the rest to your imagination.

The possibilities really are endless, but cost and time are of course an implication. In the previous article I hinted at atomic-level 3D printing. Well, how about subatomic, which would be a quantum level of manipulation?

Technological progress almost never travels in a linear direction but instead follows an exponential curve, meaning that as time goes by, the rate of progress of any technology increases exponentially rather than linearly. 3D printing has been slow in progress, and is older than most think, but it does fall into this exponential category. We have seen many leaps of progress within the last few years alone, and if one quantum-materials researcher, Jennifer Hoffman, has her way, we may be on the verge of incredibly rapid advancement within the industry.

Hoffman is a former professor of physics at Harvard University who later joined the University of British Columbia (UBC) as the Canada Excellence Research Chair in quantum materials and devices based on oxide heterostructures. In her new position she aims to create a research program focused on combining and creating new quantum materials with atomic precision. To take things a step further, she is trying to create a method of 3D printing incredibly tiny objects, measurable only on an atomic scale.

Of course this may sound like fiction and even a little insane. For those unfamiliar with the world of quantum materials, these are materials that are not yet completely understood, even by some of the most advanced scientists. The everyday laws of classical physics cannot explain the almost 'spooky' interactions found within such materials, which can possess incredibly useful magnetic and electronic properties. However, the Exploratory Minds Project always seeks to offer a relatively reasonable perspective on things that are not understood in the bigger picture, including how the quantum level has attributes related to the macro. In short, we believe the universe and everything in it is one thing: there are no separate aspects like 'space' and then 'matter'. That is an illusion of our perceptions; we claim instead that matter is an extension of the space itself.

These views, while very different and controversial in the current climate, are often explained in better terms and more detail in our videos and articles. We recognise that these perspectives are rare in the mainstream, so there is much to write about. Offering new perspectives instead of one singular viewpoint is far better than leaning on something like the Big Bang theory, which in many people's opinion has never satisfactorily backed itself up over decades and never will, while also sounding like a religious miracle; it's almost paranormal that it was not slated much earlier in its false acceptance. Bear in mind that the mainstream views the quantum level as almost paranormal too: to quote Einstein on quantum entanglement, he described it as 'spooky action at a distance', but that is only because there is something they are misunderstanding. We breathe a sigh of relief, though, that such things are now being excluded from physics. There is a lot of fragmentation, though, and heated debate. And so it's all the more important to revisit older physics mantras and see how they became just that: a mantra, and not a true analysis or scientific discernment.

OK, after all this rant, to bring us back to the matter at hand:

Here is Professor Hoffman's website:

Monday, November 26, 2018


DARPA has created an invisibility cloak for sound. This deceptively simple-looking tech could hide objects from sonar, although its creation was anything but simple and required some complex maths. Engineers from Duke University in Durham, North Carolina, claim that, unlike other efforts, their acoustic cloaking device works in all three dimensions: no matter which direction a sound comes from, the structure will not give a reliable bounce back.

The device looks like a layered, pyramidal structure and is constructed from plastic plates with a repeating pattern of holes, stacked on top of one another. Its design means that sound waves can be manipulated by nothing more than plastic and air. The cloak alters the sound waves' trajectory to match what it would look like had they reflected off a flat surface. The sound waves do not penetrate the pyramid, but they do travel a shorter distance, which in turn affects the waves' speed. The device then reroutes these slower sound waves to create the impression that both the cloak and anything beneath it are not there.
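The timing trick can be sketched with one line of arithmetic (illustrative numbers, not Duke's measurements): the cloak's shorter internal path must carry the wave at a slower effective speed so that it arrives exactly when a flat-surface reflection would.

```python
C_AIR = 343.0  # speed of sound in air, m/s

def required_speed(short_path_m, flat_path_m):
    """Effective speed the cloak must impose on its shorter internal path
    so the wave arrives exactly when a flat-surface reflection would."""
    flat_time = flat_path_m / C_AIR   # travel time over the flat-surface path
    return short_path_m / flat_time   # speed that matches that arrival time

# A wave that should appear to travel 1.0 m (down to a flat floor and back)
# but actually only covers 0.8 m through the cloak:
v = required_speed(0.8, 1.0)
print(round(v, 1))  # 274.4 m/s -- slower than sound in open air
```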

Below is an image of the device.


Bogdan Popa, a graduate student in electrical and computer engineering, shows off the 3D acoustic cloak he helped design and build as a member of Steven Cummer's laboratory. If you ever wanted to dupe an enemy into thinking their sonar is hitting nothing, this new cloaking device is perfect for the job: a mathematical marvel of engineering.

Steven Cummer, professor of electrical and computer engineering at Duke, explains on Duke's website: 'The particular trick we're performing is hiding an object from sound waves. By placing this cloak around an object, the sound waves behave like there is nothing more than a flat surface in their path.'

Accomplishing this nifty trick would be difficult without harnessing the power of metamaterials, which result from combining existing materials in ways that force them to alter their properties and behave unnaturally. The acoustic cloak is essentially just plastic and air cleverly combined, according to Cummer. Once the sound hits the structure, the result of painstaking mathematical calculations by engineers comes into play. It goes as far as fooling the sound as to how far it travels, making it compensate for its surroundings. The deceptively simple design is anything but, the professor explains.

“I promise you that it’s a lot more difficult and interesting than it looks. We put a lot of energy into calculating how sound waves would interact with it. We didn’t come up with this overnight,” Cummer said.

The project was clearly a success for Duke, who tested it by cloaking a small sphere and bouncing sound off it, recorded with a microphone, then doing the same without the device. Prospects for development are, the cloak's creators believe, understandably endless. The cloak could in future be used to fool enemy sonar, as well as for things like architectural acoustics, compensating for the conflict between sound and the structural requirements of a building.

But this is not the first time the military has thought of something like this. Of course, the ability to cheaply 3D print it is a huge bonus, but the concept of using metamaterials, as well as the principle of making sound pass around an object, goes back some time.

America's Defense Advanced Research Projects Agency (DARPA) has been at it for ages, announcing in 2007 that it would like to be able to make a whole military vehicle inaudible. In its 2008 defense budget it wrote that the effort would use directed ultrasound technology to significantly reduce sound emissions from large-scale tactical military hardware. Theory predicts that nonlinear effects of high-power acoustic radiation on the atmosphere can cause acoustic energy to dissipate, rather than radiate.


OK, by now anyone interested in the latest technologies will have heard of 3D printing. I'm not one to rub it in and shout 'I told you so!' But, well, I told you so! As the founder of the Exploratory Minds Project I've always said this tech will change the world and cause a huge paradigm shift. That may be a little dramatic for some, but read on and we shall see why I speak the way I do.


If we think in terms of the graphical resolution that gave computers higher accuracy over time, this is really how 3D printing has been evolving since its conception. Increasingly, 3D-printing technologies manipulate smaller and smaller particles onto each layer, bringing more and more intricate detail within a given finite space. I predict that at some point we will reach an atomic level of manipulation. The practical problem is that the higher you go in particle resolution, the more time a creation needs and the more effort is required to maintain accuracy during production. One of the most paradigm-shifting aspects of 3D printing is the creation of biological materials.
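The resolution/time trade-off above can be put in back-of-envelope numbers: voxel count (and, naively, print time) scales with the cube of linear resolution, so halving the feature size means eight times the work. A small illustrative sketch, with made-up volumes:

```python
def voxel_count(volume_mm3, feature_size_mm):
    """Number of cubic voxels of the given feature size filling a volume."""
    return volume_mm3 / feature_size_mm ** 3

base = voxel_count(1000.0, 0.1)    # a 1 cm cube at 100-micron features
fine = voxel_count(1000.0, 0.05)   # the same cube at 50-micron features
print(round(fine / base))  # 8 -- halving feature size octuples the voxel count
```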


On June 14th, 2017, 30,000 feet above the Gulf of Mexico, a Zero Gravity Corp. (ZERO-G) aircraft took up members of three high-tech companies involved in a technology partnership: NASA contractor Techshot Inc., industrial 3D bioprinter and electronics-printer manufacturer nScrypt Inc., and bioink developer Bioficial Organs Inc.


These types of flights are often used for entertainment and tourism purposes, including space tourism, and by the government. This time, however, the trip had a much more scientific purpose in mind: the sustained microgravity conditions possible for several seconds at a time.


To quote Techshot, they have a 'space hardened 3D bioprinter'. This printer went up with the team, showcasing in an incredible way each company's unique offerings and the results of their combined efforts to actually bioprint human tissues, and eventually organs, in orbit!



'It's like drawing with a fine-point pen rather than a crayon. Some of the tips on our 3D electronics printers are nearly as small as a single human cell,' says nScrypt Chairman and CEO Kenneth Church, PhD.


Like I say above, the resolution is getting higher. Once the tech reaches a resolution of atomic accuracy, you get to print the molecules that make up cell membranes. That is probably a very long way off, but provided mankind does not destroy itself first, we will see this ability, I guarantee it.


The partnership, led by Techshot, will put the data collected from this test flight to use as they continue work toward their goal of creating viable, transplantable human organs. Of course this research supports an organ supply for patients here on Earth, but it also extends into the farther reaches of the future, when longer-term space travel or even off-planet colonization may be possible. John Vellinger, Executive Vice President and CEO of Techshot, is optimistic about the results from this flight and what it means for the future.



Vellinger’s next aspiration for this technology is to take what they’ve learned and incorporate it into a next-generation bioprinter, which is set to be more robust and have a smaller footprint. The bioprinter was also intended for launch aboard a commercial Blue Origin suborbital space capsule, which may well have already happened, as this news dates back to June 17, 2016; we haven't found any data or news related to its progress yet. On board the ISS, however, is where it would operate autonomously, planned to print “thicker, more complex tissues,” as Techshot notes. The first expected test run for the ISS bioprinter is a beating human heart patch, as the machine is planned to build pacing wires and biosensors into the tissues.



This is all going to read a little weird and far out, but to help the reader understand, we recommend reading the geometry articles relating to infinity within the vacuum of space.

If we take into account the geometry articles that arrive at a conclusion of infinite density in the vacuum of space, despite it being 'seemingly' empty, or at least far less dense than conventionally perceived, then we propose that it only feels less dense because it is all collapsing into a central point of cancellation, giving each point in that space a stillness or equilibrium, with no inertia or outward dynamic except where we experience matter extending from it. And so we also propose that all matter actually comes from this space. Perhaps the most important aspect here is to remember how we also arrived at self-similarity with the fractal math, and that consciousness appears to be based in a feedback loop itself: to be self-aware is to experience the existence of the self coming into the awareness, not outward from it. Otherwise there would be no awareness at all. And so, with the fractal math feedback, this would also mean all points, at whatever scale you wish to measure, contain all information of the whole.

It works better to think of this as information rather than matter or a particle dynamic, or better yet in terms of frequency. With all these concepts in place, if matter is a direct extension of space and not a separate thing, does consciousness come from it, or is it a direct result of it? We think looking for consciousness in the brain is erroneous; the brain appears to be a processing tool for consciousness itself. No science today has concluded where consciousness comes from at all; we think they are looking in the wrong place.

Now, this of course flies in the face of conventional thinking. However, many apparent things are only observed when individuals take it upon themselves to look, or when someone else has inadvertently presented them. And of course things are often interpreted differently from other perspectives. So what are the observable effects we can interpret to support the concept that consciousness is not created in the brain?

How about that time you thought of a friend or family member, and then they rang you in the same moment, or arrived at your door? Nearly everyone with an exploratory mind that we have asked can remember something similar. Is this mere accidental coincidence, or random chance? We are proposing that while we experience consciousness subjectively, we are connected collectively via this infinite feedback, or fractal dynamic, that holds all information of the whole inside every point. Change any point, no matter how small, and the whole thing changes: not just that point in space and time, but the whole universe. This is what happens when we alter the math for a fractal: no matter how small the change, the results are reflected in the whole thing, because the sum result is fed back into the original formula. This could also mean it is a collective consensus reality, built via these very subtle changes from all the thinkers and conscious effects of our time, even those who are not of the human race, since, of course, there is more than one conscious effect and intention in this universe.
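The feedback property described above, where the whole result of each step is fed back into the formula so that a tiny change is eventually reflected everywhere, is easy to demonstrate numerically. As a minimal sketch, the logistic map x → r·x·(1 − x), a standard toy feedback formula from chaos theory, stands in here for the fractal iteration; the starting values and step counts are illustrative only.

```python
# Minimal sketch of feedback sensitivity: the output of each step is fed
# straight back into the formula, so a perturbation of one part in a
# billion eventually changes the entire trajectory.

def feedback(x, r=4.0, steps=40):
    """Iterate the feedback formula, returning the whole trajectory."""
    trajectory = [x]
    for _ in range(steps):
        x = r * x * (1.0 - x)   # the full result is fed back in
        trajectory.append(x)
    return trajectory

a = feedback(0.3)               # baseline starting point
b = feedback(0.3 + 1e-9)        # perturbed by one part in a billion

# Early on the two runs agree closely; by the end they have diverged.
print(abs(a[5] - b[5]), abs(a[-1] - b[-1]))
```

This is only an analogy, not a model of consciousness, but it shows concretely what "alter the math no matter how small and the results are reflected in the whole thing" means for an iterated feedback formula.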

Now, when it comes to theories of so-called psychic experience, there are hundreds of frauds, with mindsets based in mere egocentric self-importance and a goal to con people. We think that directly claiming to perceive something in conventional detail would equate to the person having knowledge of what they are perceiving prior to acting out a so-called psychic skill.

If such super-connectivity of the fractal concept is a real thing, and consciousness really isn't in the brain but in everything, or is perhaps like a universal field, as another way to think of it, then we should be able to harness this via deliberate focus or attention to bring in data without having to be at the location the data relates to. Experimenters have given targets to people who claim such a skill in a way that reveals no conventional data about the target: instead, the experimenter assigns a random 4-digit number to an object or place for the person to focus on, and a set of information related to the target ends up coming in as various aspects, since they would not be using the typical senses in a conventional manner. Of course we know how being present in a location, looking, and perceiving works; that is plain and simple. But this other way has actually been harnessed and practised by none other than the CIA. We kid you not.

The CIA headed up a project named STARGATE in the 1970s. After running a lot of experiments and reworking various sets of practices and skills, they ended up with something very interesting: a skill that became known as 'remote viewing'.

It did indeed show aspects of non-locality, meaning that all points in space and time are equally information-rich, and that all a seasoned and skilled remote viewer needed was a tag or number assigned to a target; regardless of distance or time, they would come up with accurate aspects of the targets given.

There are many forms of applying the skill, and in early experiments some of the skill sets included listing various aspects. The following would be a typical list, worked through while the remote viewer took pen to paper:


What does the object/target feel like:

What shapes are there:

Is there a light source:

Is there sound:

What smells are there:

The list was extensive, but these were highly important tools for discerning what could be matched with the real target later. It was found that imagination sometimes caused the remote viewer to lose the signal and come up with things like 'football' for a spherical object. Results like this led the training to stop using common nouns like 'ball' and to use 'sphere' or 'spherical' instead, which produced more valid accuracy in the results. Most of the data is never first perceived as an image per se, but more as the feeling of a shape, or a kinesthetic. Often, if an image comes in clear and sharp early on, it is usually just imagination, or AOL (analytical overlay).

Below we have a video of Dr. Hal Puthoff's experiences and research in remote viewing with the Stanford Research Institute (SRI), which also worked with the CIA on the research.


