Wednesday, April 25, 2012

A recent study by the United States Geological Survey (USGS) shows that the oil and gas industry is creating earthquakes. New data from the Midwest region of the United States indicate that these man-made earthquakes are happening more frequently than expected. While more frequent earthquakes are less of a problem for regions like the Midwest, Dr. Paul K. Doss, a geology professor at the University of Southern Indiana, believes the disposal of wastewater from the hydraulic fracturing (or “fracking”) process used in extracting oil and gas could pose problems for groundwater.

“We are taking this fluid that has a whole host of chemicals in it that are useful for fracking and putting it back into the Earth,” Doss said. “From a purely seismic perspective these are not big earthquakes that are going to cause damage or initiate, as far as we know, any larger kinds of earthquake activity for the Midwest. [The issue] is a water quality issue in terms of the groundwater resources that we use.”

Hydraulic fracturing, or fracking, is a technique used by the oil and gas industry in which highly pressurized water is injected down into the Earth’s crust to break rock and extract natural gas. Most of the fluids used for fracking are proprietary, so the chemicals they contain are unknown to the public, a secrecy that helps maintain a competitive edge.

Last Monday, four researchers from the University of New Brunswick released an editorial that sheds light on the potential risks the current wastewater disposal system poses to the province’s water resources. The researchers share Dr. Doss’s concern and have said that they believe fracking should be halted in the province until there is an environmentally safe way to dispose of the wastewater.

“If groundwater becomes contaminated, it takes years to decades to try to clean up an aquifer system,” University of New Brunswick professor Tom Al said.

While the USGS group that conducted the study says it is unclear how earthquake rates may be related to oil and gas production, it has identified a correlation between the disposal of wastewater used in fracking and the recent upsurge in earthquakes. Because of this emerging evidence of a connection between the disposal process and earthquakes, individual states in the United States are now passing laws regarding disposal wells.


“The problem is that we have never, as a human society, engineered a hole to go four miles down in the Earth’s crust that we have complete confidence that it won’t leak,” Doss said. “A perfect case-in-point is the Gulf of Mexico oil spill in 2010; that oil was being drilled at 18,000 feet but leaked at the surface. And that’s the concern, because there’s no assurance that some of these unknown chemical cocktails won’t escape before they get down to where they are trying to get rid of them.”

The University of New Brunswick researchers’ editorial says that if fracking wastewater were to contaminate groundwater, conventional water treatment would not be sufficient to remove the high concentrations of chemicals used in fracking. The researchers did find that the wastewater could be recycled, disposed of at proper sites, or pumped further underground into saline aquifers.

The New Brunswick professors have concluded that the current water-based fracking methods used by companies should be replaced with carbon dioxide or liquefied propane gas.

“You eliminate all the water-related issues that we’re raising, and that people have raised in general across North America,” Al said.

In New Brunswick, liquefied propane gas has been used successfully in fracking some wells, but according to Annie Daigle, a water specialist with the province’s Natural Resources Department, it may not be the go-to solution for New Brunswick due to its geological makeup.

“It has been used successfully by Corridor Resources here in New Brunswick for lower volume hydraulic fracturing operations, but it is still a fairly new technology,” Daigle said.

The United States Environmental Protection Agency (EPA) is working with U.S. states to develop guidelines for managing seismic risks due to wastewater. Under the Safe Drinking Water Act, the EPA also sets the policies for disposal wells.

Oil wells, which are under regulation, pump out salt water known as brine, and after brine is pumped out of the ground it is disposed of by being pumped back into the ground. The difference between pumping brine back into the ground and doing the same with highly pressurized fracking fluid is the volume disposed of.

“Brine has never caused this kind of earthquake activity,” Doss said. “[The whole oil and gas industry] has developed around the removal of natural gas by fracking techniques and has outpaced regulatory development. The regulation is tied to the ‘run-of-the-mill’ disposal of waste; in other words, the rush to produce this gas has occurred before regulatory agencies have had the opportunity to respond.”

According to the USGS study, the increase in injecting wastewater into the ground may explain the sixfold increase in earthquakes in the central part of the United States from 2000 to 2011. USGS researchers also found that in the decades prior to 2000, seismic events in the midsection of the U.S. averaged 21 annually; in 2009 the count spiked to 50, and in 2011 it hit 134.
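Those figures are simple to sanity-check. A back-of-the-envelope calculation, using only the counts reported above, bears out the “sixfold” characterization:

```python
# Sanity check of the USGS figures quoted above (21/year baseline before 2000,
# 50 events in 2009, 134 in 2011). All numbers come from the article.
baseline = 21     # average annual seismic events in the U.S. midsection pre-2000
count_2009 = 50
count_2011 = 134

print(f"2009 vs. baseline: {count_2009 / baseline:.1f}x")  # ~2.4x
print(f"2011 vs. baseline: {count_2011 / baseline:.1f}x")  # ~6.4x, i.e. roughly sixfold
```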

“The incredible volumes and intense disposal of fracking fluids in concentrated areas is what’s new,” Doss said. “There is not a body of regulation in place to manage how these fluids are disposed of.”

The study by the USGS was presented at the annual meeting of the Seismological Society of America on April 18, 2012.

Retrieved from “https://en.wikinews.org/w/index.php?title=Disposal_of_fracking_wastewater_poses_potential_environmental_problems&oldid=3931361”


Submitted by: Scott Lindsay

There is a political strategy that has been used successfully for several years now. The idea is to support and push for candidates with low negatives.

This philosophy tends to suggest that a winning candidate may not have the overall highest degree of positives, but they absolutely must have low negatives.

If the news media finds a skeleton in the closet, this is a negative and will be heavily reported in most cases. Since it is normal for the media to fixate on bad or negative news, it stands to reason that negative press will gain the most coverage and result in a fluctuating opinion of the candidate.

While everyone was pushing for a very positive candidate at one time, the new trend is to get behind someone whose background offers few potential negative reports.

The same is true with your Internet business and subsequent marketing. You want to make sure that above all else there are very few downsides to the product or service you are marketing.


By all means accentuate the positive. After all, it is your website, so hit the high points at will. Just make sure that you back products you truly believe have low negatives or have been adequately proven to have very little downside.

If you have a customer who discovers a flaw in your product and you don’t want to address the issue, that individual can spread word quickly via the web and through consumer-oriented feedback sites.

Let me be clear: your product doesn’t have to be the absolute best product ever produced, but it does need to do what you say it will do. Negative press can adversely affect your business.

If you can present your business with the aim of satisfying 80% of your consumers while not leaving a bad taste in the proverbial mouths of the other 20%, then you can likely experience continued growth and customer support. Once negatives get beyond 20% they start causing your business problems.
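As a toy illustration of that 80/20 rule of thumb (the 20% threshold is the author’s heuristic; the feedback counts below are invented):

```python
# Toy illustration of the "negatives beyond 20% cause problems" heuristic above.
# The 0.20 threshold is the author's rule of thumb; the sample tallies are made up.
def negative_ratio(positive: int, negative: int) -> float:
    """Fraction of all feedback that is negative."""
    return negative / (positive + negative)

reviews = {"positive": 412, "negative": 88}  # hypothetical feedback tallies
ratio = negative_ratio(reviews["positive"], reviews["negative"])
print(f"negative share: {ratio:.0%}")        # 18%
print("healthy" if ratio <= 0.20 else "negatives are hurting the business")
```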

For instance, if you have a product that you feel could benefit from marketing the link between the product and a reduction in carbon output, then you would likely benefit from this marketing strategy. If, however, you have a sense that this may be viewed as controversial, then you can decrease potential negatives by not making this issue a cornerstone of your marketing plan.

We’ve all encountered products that are pushed heavily. The product seems to be flying off the shelf, but the business is abandoned in a short period of time. The reason is often known deficits in the product: the business owner just wanted to dump and run – get rid of the product before the consumer figures out it wasn’t worth it.

If you want to be in business online for the long term, you may want to consider choosing products with low negatives. The idea is to find and sell products that people have a hard time saying anything bad about.

When you devise a marketing plan based around controversy you intentionally polarize your client base and force yourself into a niche market. That market could be lucrative, but it could also be a frustrating disappointment if you should alienate otherwise willing buyers.

About the Author: Make A Website in minutes with the Website Builder at HighPowerSites.com. Start a HOME BUSINESS and Sell Ebooks at BooksWealth.com.

Source: isnare.com

Permanent Link: isnare.com/?aid=324165&ca=Business



Monday, August 16, 2021

On Sunday, the Taliban moved into Kabul, the Afghan capital. Reports say they have seized the presidential palace. The move comes as Afghan president Ashraf Ghani fled Kabul, possibly for Tajikistan or Uzbekistan, according to The Independent.

According to an earlier report by the BBC, Taliban spokesman Zabihullah Mujahid said the Taliban had been ordered to enter parts of Kabul for security purposes, as security forces had recently left. He said, “Our forces are entering Kabul with all caution” and that the militants were told not to hurt the public or enter their homes.

The United States has responded by deploying 5,000 soldiers to help evacuate its embassy staff and the Afghans who assisted them, while around 600 British soldiers have been deployed for similar purposes, the BBC stated. Other countries are in the process of removing their own citizens from the country.

The US recently withdrew from Afghanistan after nearly 20 years of occupation. US president Joe Biden said an “endless American presence in the middle of another country’s civil conflict” could not be justified. Many Afghan civilians have recently fled to Kabul to escape growing Taliban control, with the Taliban controlling the majority of the country’s territory. Many residents have fled to the city’s airport to leave the country, including some who abandoned their cars in traffic jams and opted to walk instead.

Retrieved from “https://en.wikinews.org/w/index.php?title=Afghan_Taliban_occupies_Kabul_as_president_Ghani_flees_the_country&oldid=4634917”



Due to the damage by Hurricane Katrina and subsequent flooding, a number of colleges and universities in the New Orleans metropolitan area will not be able to hold classes for the fall 2005 semester. It is estimated that 75,000 to 100,000 students have been displaced.[1] In response, institutions across the United States and Canada are offering late registration for displaced students so that their academic progress is not unduly delayed. Some are offering free or reduced admission to displaced students. At some universities, especially state universities, this offer is limited to residents of the area.

Retrieved from “https://en.wikinews.org/w/index.php?title=Colleges_offering_admission_to_displaced_New_Orleans_students/LA-ND&oldid=4617833”


Monday, May 17, 2010

A seven-year-old girl was shot and killed Sunday during a police raid in Detroit, Michigan, United States, when one of the officers’ guns accidentally discharged during a confrontation with the girl’s grandmother. Police were raiding the home in search of a 34-year-old man suspected of murdering teenager Jarean Blake near a local liquor store the previous night.

According to police reports, Mertilla Jones got into a verbal confrontation with a police officer, which quickly turned physical. The woman, according to the police, then came into contact with the officer, inadvertently setting off the gun he was carrying. The bullet struck the girl, Aiyana Jones, in the neck while she was sleeping on the family’s couch.

The murder suspect was apprehended later that day. However, the family and public were outraged at the killing of seven-year-old Aiyana. In response, the Detroit Police Department held a press conference.

“We cannot undo what occurred this morning […] All we can do is to pledge an open and full investigation and to support Aiyana’s family in whatever way they may be willing to accept from us at this time,” said Assistant Police Chief Goodbee, speaking on behalf of the Police Chief, who was on vacation at the time of the shooting.

Meanwhile the girl’s father, Charles Jones, was outraged about the incident. “She had a lively, free spirit,” said Jones. “They came into my house with a flash grenade and a bullet […] They say my mother [Mertilla] resisted them, that she tried to take an officer’s gun. My mother had never been in handcuffs in her life. They killed my baby and I want someone to tell the truth,” he added.

The incident is currently under investigation by the Detroit Police Department. It is not known who fired the shot that killed the girl, or whether the officer will face any disciplinary action from the department.

Retrieved from “https://en.wikinews.org/w/index.php?title=Seven-year-old_girl_killed_in_Detroit,_Michigan_police_raid&oldid=4512466”


by Alma Abell

People relocate for many reasons. Dentists like to do the same thing. They build up a patient list and then sell their practice to a dentist who will take care of those patients. This allows one dentist to build a life in another region of the country, while allowing another the opportunity to begin their own practice with patients already in place. The patients are very important to both dentists. Even though the buyer can purchase the office space with all the dental equipment, the important thing is to have clients who need the modern procedures both dentists are familiar with.

If you’ve just read a want ad in the real estate section of the newspaper stating “Sell Dental Practice in San Francisco Bay Area,” chances are that it’s being sold by a specialized real estate group. This group would be dentists and brokers who know the real estate business and all about what it takes to buy or sell a dental practice. Some dentists would be acclimated to the classic patient in the affluent areas who can afford dentistry, no matter the procedure involved. These individuals want the bright, straight smile and possible enhancements to lines around the nose and mouth. This type of office, with patient records included, would cost much more than just office space with dental equipment.

Like many other people who are ready to retire, dentists who’ve been taking care of their patients’ teeth for as long as 40 years want to begin traveling and spending time with grandchildren. They decide to get help in selling their business, and you can, too. If you would like more information on how you can obtain a fair price for a practice you spent years building, click here. If you’re a person wanting to buy such a practice, you can be brought together with a seller who is anxious to sell. The important thing to note is that these are dentists and brokers with years of experience in dental practice and real estate sales. They take the time to talk with each client in order to understand the goals of a buyer and the seller.

Give one of the companies that assists with buying or selling a dental practice a call today. The next time you read an ad that states “Sell Dental Practice in San Francisco Bay Area” it will be the ad that they placed for you.



Thursday, August 16, 2007

The name Robert Cailliau may not ring a bell to the general public, but his invention is the reason why you are reading this: Dr. Cailliau together with his colleague Sir Tim Berners-Lee invented the World Wide Web, making the internet accessible so it could grow from an academic tool to a mass communication medium. Last January Dr. Cailliau retired from CERN, the European particle physics lab where the WWW emerged.

Wikinews offered the engineer a virtual beer from his native country Belgium, and conducted an e-mail interview with him (which started about three weeks ago) about the history and the future of the web and his life and work.

Wikinews: At the start of this interview, we would like to offer you a fresh pint on a terrace, but since this is an e-mail interview, we will limit ourselves to a virtual beer, which you can enjoy here.

Robert Cailliau: Yes, I myself once (at the 2nd international WWW Conference, Chicago) said that there is no such thing as a virtual beer: people will still want to sit together. Anyway, here we go.

Retrieved from “https://en.wikinews.org/w/index.php?title=Wikinews_interviews_World_Wide_Web_co-inventor_Robert_Cailliau&oldid=4608361”



Sunday, May 28, 2006

Stardust is a NASA space capsule that collected samples from comet 81P/Wild (also known as “Wild 2”) in deep space and landed back on Earth on January 15, 2006. It was decided that a collaborative online review process would be used to “discover” the microscopically small samples the capsule collected. The project is called Stardust@home. Unlike distributed computing projects like SETI@home, Stardust@home relies entirely on human intelligence.

Andrew Westphal is the director of Stardust@home. Wikinews interviewed him for May’s Interview of the Month (IOTM) on May 18, 2006. As always, the interview was conducted on IRC, with multiple people asking questions.

Some may not know exactly what Stardust or Stardust@home is. Can you explain more about it for us?

Stardust is a NASA Discovery mission that was launched in 1999. It is really two missions in one. The primary science goal of the mission was to collect a sample from a known primitive solar-system body, a comet called Wild 2 (pronounced “Vilt-two” — the discoverer was German, I believe). This is the first [US] “sample return” mission since Apollo, and the first ever from beyond the moon. This gives a little context. By “sample return” of course I mean a mission that brings back extraterrestrial material. I should have said above that this is the first “solid” sample return mission — Genesis brought back a sample from the Sun almost two years ago, but Stardust is also bringing back the first solid samples from the local interstellar medium — basically this is a sample of the Galaxy. This is absolutely unprecedented, and we’re obviously incredibly excited. I should mention parenthetically that there is a fantastic launch video — taken from the POV of the rocket on the JPL Stardust website — highly recommended — best I’ve ever seen — all the way from the launch pad, too. Basically interplanetary trajectory. Absolutely great.

Is the video available to the public?

Yes [see below]. OK, I digress. The first challenge that we have before [we] can do any kind of analysis of these interstellar dust particles is simply to find them. This is a big challenge because they are very small (order of a micron in size) and are somewhere (we don’t know where) on a HUGE collector — at least on the scale of the particle size — about a tenth of a square meter. So

We’re right now using an automated microscope that we developed several years ago for nuclear astrophysics work to scan the collector in the Cosmic Dust Lab in Building 31 at Johnson Space Center. This is the ARES group that handles returned samples (Moon Rocks, Genesis chips, Meteorites, and Interplanetary Dust Particles collected by U2 in the stratosphere). The microscope collects stacks of digital images of the aerogel collectors in the array. These images are sent to us — we compress them and convert them into a format appropriate for Stardust@home.

Stardust@home is a highly distributed project using a “Virtual Microscope” that is written in HTML and JavaScript and runs on most browsers — no downloads are required. Using the Virtual Microscope, volunteers can search over the collector for the tracks of the interstellar dust particles.
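Conceptually, the Virtual Microscope is a focus-stack viewer: you step up and down through images of one field of aerogel taken at different focal depths, looking for tracks. The sketch below (in Python rather than the HTML and JavaScript the project actually uses, with synthetic noise images standing in for real scan data) only illustrates that interaction; it is not the actual Stardust@home code:

```python
# Minimal sketch of the focus-stack interaction: arrow keys step through
# focal slices of one field of view. The 3-slice synthetic stack is a
# hypothetical stand-in for real aerogel scan data.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
stack = rng.normal(0.5, 0.05, size=(3, 256, 256))  # synthetic focus slices

fig, ax = plt.subplots()
im = ax.imshow(stack[0], cmap="gray")
ax.set_title("slice 1/3 (up/down arrows to change focus)")

state = {"i": 0}

def on_key(event):
    # Step up or down through the stack, wrapping at the ends.
    if event.key in ("up", "down"):
        state["i"] = (state["i"] + (1 if event.key == "up" else -1)) % len(stack)
        im.set_data(stack[state["i"]])
        ax.set_title(f"slice {state['i'] + 1}/{len(stack)}")
        fig.canvas.draw_idle()

fig.canvas.mpl_connect("key_press_event", on_key)
plt.show()
```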

How many samples do you anticipate being found during the course of the project?

Great question. The short answer is that we don’t know. The long answer is a bit more complicated. Here’s what we know. The Galileo and Ulysses spacecraft carried dust detectors onboard that Eberhard Gruen and his colleagues used to first detect and then measure the flux of interstellar dust particles streaming into the solar system. (This is a kind of “wind” of interstellar dust, caused by the fact that our solar system is moving with respect to the local interstellar medium.) Markus Landgraf has estimated the number of interstellar dust particles that should have been captured by Stardust during two periods of the “cruise” phase of the interplanetary orbit in which the spacecraft was moving with this wind. He estimated that there should be around 45 particles, but this number is very uncertain — I wouldn’t be surprised if it is quite different from that. That was the long answer! One thing that I should say… is that like all research, the outcome of what we are doing is highly uncertain. There is a wonderful quote attributed to Einstein — “If we knew what we were doing, it wouldn’t be called ‘research’, would it?”
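To get a feel for how uncertain such a count is, here is a small illustrative calculation; the Poisson framing is an assumption added here, and only the ~45-particle estimate comes from the interview:

```python
# Illustrative only: treat Landgraf's ~45-particle estimate (quoted above) as
# the mean of a Poisson process and look at the counting uncertainty alone.
# The Poisson assumption is ours; the estimate itself is from the interview.
import math

expected = 45                # Landgraf's estimated number of captured particles
sigma = math.sqrt(expected)  # Poisson standard deviation, ~6.7
print(f"{expected} +/- {sigma:.1f} particles (counting statistics alone)")
# And that ignores the much larger systematic uncertainty in the flux estimate.
```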

How big would the samples be?

We expect that the particles will be of order a micron in size. (A millionth of a meter.) When people are searching using the virtual microscope, they will be looking not for the particles, but for the tracks that the particles make, which are much larger — several microns in diameter. Just yesterday we switched over to a new site which has a demo of the VM (virtual microscope) I invite you to check it out. The tracks in the demo are from submicron carbonyl iron particles that were shot into aerogel using a particle accelerator modified to accelerate dust particles to very high speeds, to simulate the interstellar dust impacts that we’re looking for.

And that’s on the main Stardust@home website [see below]?

Yes.

How long will the project take to complete?

Partly the answer depends on what you mean by “the project”. The search will take several months. The bottleneck, we expect (but don’t really know yet) is in the scanning — we can only scan about one tile per day and there are 130 tiles in the collector…. These particles will be quite diverse, so we’re hoping that we’ll continue to have lots of volunteers collaborating with us on this after the initial discoveries. It may be that the 50th particle that we find will be the real Rosetta stone that turns out to be critical to our understanding of interstellar dust. So we really want to find them all! Enlarging the idea of the project a little, beyond the search, though is to actually analyze these particles. That’s the whole point, obviously!

And this is the huge advantage with this kind of a mission — a “sample return” mission.

Most missions do things quite differently: you have to build an instrument to make a measurement, and that instrument design gets locked in several years before launch, practically guaranteeing that it will be obsolete by the time you launch. Here exactly the opposite is true. Several of the instruments that are now being used to analyze the cometary dust did not exist when the mission was launched. Further, some instruments (e.g., synchrotrons) are the size of shopping malls — you don’t have a hope of flying these in space. So we can and will study these samples for many years. AND we have to preserve some of these dust particles for our grandchildren to analyze with their hyper-quark-gluon plasma microscopes (or whatever)!
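The scanning bottleneck mentioned above is easy to put in rough numbers, using only the one-tile-per-day rate and the 130-tile count from the interview:

```python
# Rough timeline for the scanning bottleneck described above: about one tile
# scanned per day, 130 tiles in the collector. Figures are from the interview.
tiles = 130
tiles_per_day = 1

days = tiles / tiles_per_day
print(f"~{days:.0f} days, i.e. roughly {days / 30:.1f} months of scanning")  # ~4.3 months
```

That is consistent with the “several months” estimate given for the search.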

When do you anticipate the project to start?

We’re really frustrated with the delays that we’ve been having. Some of it has to do with learning how to deal with the aerogel collectors, which are rougher and more fractured than we expected. The good news is that they are pretty clean — there is very little of the dust that you see on our training images — these were deliberately left out in the lab to collect dust so that we could give people experience with the worst case we could think of. In learning how to do the scanning of the actual flight aerogel, we uncovered a couple of bugs in our scanning software — which forced us to go back and rescan. Part of the other reason for the delay was that we had to learn how to handle the collector — it would cost $200M to replace it if something happened to it, so we had to develop procedures to deal with it, and add several new safety features to the Cosmic Dust Lab. This all took time. Finally, we’re distracted because we also have many responsibilities for the cometary analysis, which has a deadline of August 15 for finishing analysis. The IS project has no such deadline, so at times we had to delay the IS (interstellar, sorry) in order to focus on the cometary work. We are very grateful to everyone for their patience on this — I mean that very sincerely.

And rest assured that we’re just as frustrated!

I know there will be a “test” that participants will have to take before they can examine the “real thing”. What will that test consist of?

The test will look very similar to the training images that you can look at now. But… there will of course be no annotation to tell you where the tracks are!

Why did NASA decide to take the route of distributed computing? Will they do this again?

I wouldn’t say that NASA decided to do this — the idea for Stardust@home originated here at U. C. Berkeley. Part of the idea of course came…

If I understand correctly it isn’t distributed computing, but distributed eyeballing?

…from the SETI@home people, who are just down the hall from us. But as Brian just pointed out, this is not really distributed computing like SETI@home — the computers are just platforms for the VM, and it is human eyes and brains that are doing the real work, which makes it fun (IMHO).

That said… there have been quite a few people who have expressed interest in developing automated algorithms for searching. Just because WE don’t know how to write such an algorithm doesn’t mean nobody does. We’re delighted at this and are happy to help make it happen.

Isn’t there a catch-22 in that the data you’re going to collect would be a prerequisite to automating the process?

That was the conclusion that we came to early on — that we would need some sort of training set to be able to train an algorithm. Of course you have to train people too, but we’re hoping (we’ll see!) that people are more flexible in recognizing things that they’ve never seen before and pointing them out. Our experience is that people who have never seen a track in aerogel can learn to recognize them very quickly, even against a big background of cracks, dust and other sources of confusion… Coming back to the original question — although NASA didn’t originate the idea, they are very generously supporting this project. It wouldn’t have happened without NASA’s financial support (and of course access to the Stardust collector). Did that answer the question?

Will a project like this be done again?

I don’t know… There are only a few projects for which this approach makes sense… In fact, I frankly haven’t run across another at least in Space Science. But I am totally open to the idea of it. I am not in favor of just doing it as “make-work” — that is just artificially taking this approach when another approach would make more sense.

How did the idea come up to do this kind of project?

Really desperation. When we first thought about this we assumed that we would use some sort of automated image recognition technique. We asked some experts around here in CS and the conclusion was that the problem was somewhere between trivial and impossible, and we wouldn’t know until we had some real examples to work with. So we talked with Dan Werthimer and Dave Anderson (literally down the hall from us) about the idea of a distributed project, and they were quite encouraging. Dave proposed the VM machinery, and Josh Von Korff, a physics grad student, implemented it. (Beautifully, I think. I take no credit!)

I got to meet one of the Stardust directors in March during the Texas Aerospace Scholars program at JSC. She talked about searching for meteorites in Antarctica, ones that were unblemished by Earth conditions. Is that our best chance of finding new information on comets and asteroids? Or will more Stardust programs be our best solution?

That’s a really good question. Much will depend on what we learn during this official “Preliminary Examination” period for the cometary analysis. Aerogel capture is pretty darn good, but it’s not perfect and things are altered during capture in ways that we’re still understanding. I think that much also depends on what question you’re asking. For example, some of the most important science is done by measuring the relative abundances of isotopes in samples, and these are not affected (at least not much) by capture into aerogel.

Also, she talked about how some of the agencies that they gave samples to had lost or destroyed 2-3 samples while trying to analyze them. One, in fact, had become statically charged, stuck to the side of the microscope lens, and they spent over an hour looking for it. Is that really our biggest danger? Giving out samples as a show of good faith, and not letting NASA examine all samples collected?

These will be the first measurements, probably, that we’ll make on the interstellar dust. There is always a risk of loss. Fortunately for the cometary samples there is quite a lot there, so it’s not a disaster. NASA has some analytical capabilities, particularly at JSC, but the vast majority of the analytical capability in the community is not at NASA but at universities, government labs and other institutions all over the world. I should also point out that practically every analytical technique is destructive at some level. (There are a few exceptions, but not many.) The problem with meteorites is that, except in a very few cases, we don’t know where they specifically came from. So having a sample that we know for sure is from the comet is golden!

I am currently working on my Bachelor’s in computer science, with a minor in astronomy. Do you see successes of programs like Stardust opening up more private space exploration positions for people such as myself, even though I’m not in the typical “space” fields of education?

Can you elaborate on your question a little — I’m not sure that I understand…

Well, while at JSC I learned that they mostly want engineers, and a few science grads, and I worry that my computer science degree will not be very valuable, as the NASA rep told me only 1% of the applicants for their work study program are CS majors. I’m just curious as to your thoughts on whether CS majors will be more in demand now that projects like Stardust and the Mars missions have been great successes? Have you seen a trend towards more private businesses moving in that direction, especially with President Bush’s statement of Man on the Moon in 2015?

That’s a good question. I am personally not very optimistic about the direction that NASA is going. Despite recent successes, including but not limited to Stardust, science at NASA is being decimated.

I made a joke with some people at the TAS event that one day SpaceShipOne will be sent up to save stranded ISS astronauts. It makes me wonder what kind of private redundancy the US government is taking for future missions.

I guess one thing to be a little cautious about is that despite SpaceShipOne’s success, we haven’t had an orbital project that has been successful in that style of private enterprise. It would be nice to see that happen. I know that there’s a lot of interest…!

Now I know the answer to this question… but a lot do not… When samples are found, how will they be analyzed? Who gets the credit for finding the samples?

The first person who identifies an interstellar dust particle will be acknowledged on the website (and probably will be much in demand for interviews from the media!), will have the privilege of naming the particle, and will be a co-author on any papers that WE (at UCB) publish on the analysis of the particle. Also, although we are precluded from paying for travel expenses, we will invite those who discover particles AND the top performers to our lab for a hands-on tour.

We have some fun things, including micromachines.

How many people/participants do you expect to have?

About 113,000 have preregistered on our website. Frankly, I don’t have a clue how many will actually volunteer and do a substantial amount of searching. We’ve never done this before, after all!

One last thing I want to say … well, two. First, we are going to special efforts not to do any searching ourselves before we go “live”. It would not be fair to all the volunteers for us to get a jumpstart on the search. All we are doing is looking at a few random views to make sure that the focus and illumination are good. (And we haven’t seen anything — no surprise at all!) Also, the attitude for this should be “Have Fun”. If you’re not having fun doing it, stop and do something else! A good maxim for life in general!

Retrieved from “https://en.wikinews.org/w/index.php?title=Keep_your_eyes_peeled_for_cosmic_debris:_Andrew_Westphal_about_Stardust@home&oldid=4608360”


Top Utility Trailers

by esteban

Utility trailers are efficient tools that can fit on nearly any vehicle. This helpful accessory will help you move thousands of pounds without putting any overload onto your compact or midsize vehicle. Knowing the different forms of these trailers can certainly make searching for the ideal trailer a simpler task. On this page I will present three utility trailer options that you will discover useful in the time to come.

Utility trailer part number LCI-830T features a 48in. x 40in. bed. This trailer is perfectly road-ready and also has fenders. It is extremely uncomplicated to assemble, and it even comes with the basic tools. It provides one 7/8 inch coupler assembly with safety chains, approved lighting, wiring and connector, tail and turn signals, side running lights, high-speed wheels and tires, and slipper springs. It is easy to add plywood. The gross vehicle weight is 1180 lbs., and the load capacity is 1060 lbs.


Utility trailer part number LCI-958TR is a 5 x 8 trailer that includes a ramp. Its overall length is 135-1/2 in., and its overall width is 76 in. This utility trailer is incredibly helpful since a ramp is already included, making loading things much easier. The ramp width is 57-1/2 in., and the ramp length is 53 in. This utility trailer does not include plywood, and its payload capacity is 1,585 lbs.

Utility trailer part number LCI-826T is a 12 cubic ft. off-road ATV trailer. This off-road utility trailer has a capacity of 700 lbs. This trailer has two 22 x 11-8 tubeless knobby tires that ride smoothly over dirt or rocky terrain. It is built tough enough to follow your ATV into any kind of terrain you are headed for. The spacious 42in L x 30in W x 16in cargo bed is perfect for hauling feed, landscaping supplies, dirt, or equipment. Its tailgate makes it easy to load or unload cargo.
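Since load capacity is the spec that matters most when matching a trailer to a job, here is a quick comparison of the three models using only the figures quoted above (the empty-weight line assumes empty weight = gross vehicle weight minus load capacity):

```python
# Compare the capacity figures quoted above for the three trailers.
# All numbers are from the article; the empty-weight calculation assumes
# empty weight = gross vehicle weight - load capacity.
trailers = {
    "LCI-830T":  {"capacity_lbs": 1060, "gvw_lbs": 1180},
    "LCI-958TR": {"capacity_lbs": 1585},
    "LCI-826T":  {"capacity_lbs": 700},
}

for name, spec in trailers.items():
    line = f"{name}: hauls up to {spec['capacity_lbs']} lbs"
    if "gvw_lbs" in spec:
        line += f" (empty weight ~{spec['gvw_lbs'] - spec['capacity_lbs']} lbs)"
    print(line)
```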

In summary, paying close attention to the specs of each utility trailer will help you choose the right solution for your truck. You can find these utility trailers and more truck accessories at our discount truck accessories website, http://www.discounttruckaccessories.com/. Now that you know what to look for when acquiring these types of utility trailers, start shopping for the one that fits your needs.

Article Source: ArticleRich.com


