How This Hurricane Season Is Affecting U.S. Oil

Just when it seemed U.S. oil couldn’t be stopped, hurricane season 2017 arrived to rain on its parade. The National Oceanic and Atmospheric Administration predicts 2017 will bring 11 to 17 tropical storms and as many as four major hurricanes between June and December.

For offshore oil platforms, signs of a nasty season mean it’s time to batten down the proverbial hatches. When forecasters predict conditions like this season’s, all non-essential personnel are evacuated from platforms to ensure their safety. While rig operators have been through this drill before, every storm is different, and you can’t be too prepared for the chaos a hurricane can bring.

Preparing for the Storm

Making an offshore rig ready to sustain a hurricane is a delicate balancing act between protecting the employees who work on the platform and safeguarding as much oil production as possible.

The very real impact hurricane season can have on production makes every last operating hour crucial, so personnel essential to rig operation are allowed to remain aboard until a few days before the storm. Sometimes it can be less than a day, but well-trained crews know how to stay professional under pressure, because a failure in the middle of a hurricane could be catastrophic.

Within a few days of the storm’s arrival, drilling stops and all personnel are evacuated. Drill ships in the potential path of the storm are relocated to safe waters. Because storm tracks are unpredictable, operations must stop even on rigs outside the forecast path of the hurricane.

Technology is the biggest asset oil producers have in the fight against storms like Harvey, Irma and Jose. Modern oil rigs are equipped with GPS systems that allow supervisory staff to monitor their positions during and after the storm and locate them should a rig be pulled away from its drilling location by storm surge.
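As an illustration, that kind of position monitoring can be sketched in a few lines: compare a rig’s recorded drilling location with its latest GPS fix and flag it if the distance exceeds some threshold. The coordinates and the 500-meter threshold below are hypothetical examples, not an industry standard.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def drifted(station, fix, threshold_m=500):
    """Flag a rig whose latest GPS fix is farther than threshold_m
    from its recorded drilling location."""
    return haversine_m(*station, *fix) > threshold_m

# Hypothetical rig station in the Gulf vs. a post-storm GPS fix ~1.1 km away
station = (27.7000, -90.0000)
post_storm_fix = (27.7100, -90.0000)
print(drifted(station, post_storm_fix))  # True: flag this rig for inspection
```

In practice the threshold would depend on mooring type and water depth; the point is simply that a GPS trail turns “did the storm move it?” into a quick distance check.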

Restarting Operations

This year’s flurry of storms poses a grave threat to America’s prominent position in the global oil market because it hits multiple areas critical to U.S. oil production. Hurricane Harvey struck Texas’ Gulf Coast, which is home to 45 percent of American refining capacity.

Add to that the offshore operations in the Gulf, which account for 17 percent of crude oil production, and now the rigs struck by Irma and Jose, and you have the makings of a disaster.

Once again, technology will be essential to restoring production capacity as quickly as possible. Many offshore rigs are designed around lean manufacturing principles; assuming they can endure the winds and waves, those principles should help get oil production back on its feet quickly.

Lean manufacturing practices focus on reducing waste in the form of motion, downtime, over-processing and four other potential inefficiencies. By allowing an oil rig to keep producing until days before a storm hits, and to restart operations with a minimal crew, these practices can recover days of production time.

Assessing Damage

No amount of preparation can guarantee that sensitive equipment won’t be damaged in the course of a storm. That’s why drilling companies practice special flyover and assessment procedures to determine whether offshore sites are safe for returning personnel after a massive storm.

Remobilization, or “re-mob” as it’s called, is the process of gathering all company assets and ensuring they’re safe to continue work before beginning drilling operations again. Following an assessment by helicopter, small teams are dispatched to rigs and ships to determine if everything is in working order.

The ability to track every asset by GPS makes finding ships and platforms simpler than it was in the past, but the real challenge is repairing damaged equipment after a storm. Complex extraction equipment can take days or weeks to repair, with crews sometimes working round-the-clock to bring a significant drilling facility back online.

Ultimately, the small teams can bring rigs and ships back online. Once operational, assets can begin receiving more personnel. It’s a race against the clock every time, and this year it looks like those assessment crews are going to get more than their fair share of practice.

Frozen Tissue Array Methodology, Applications and Benefits

Frozen tissue array is a methodology used in modern molecular and clinical research to analyze hundreds of tumor samples on a single slide. It allows high-throughput analysis of proteins and genes in a single unit: separate frozen tissue cores are assembled together so they can undergo simultaneous histological analysis. The approach streamlines research projects, saving significant time, and conserves precious reagents compared with analyzing numerous slides that each contain a single section. It is an ideal screening tool to use before embarking on extensive research and analysis.

Preparation of frozen tissue array

Each product is produced with state-of-the-art preparation techniques from the finest-quality specimens. Upon excision, tissues are placed in liquid nitrogen and then sorted meticulously by an expert pathologist. Cores from 20 or more different tissues, or from pathologically relevant tumors, are combined in a single block, and the quality of each slide is verified with specialized staining methods. Tissues 2 mm in diameter from the region of interest are extracted from frozen OCT blocks by varying their freezing temperatures.
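For a rough sense of scale, packing cores of a fixed diameter into a single block can be turned into a simple capacity estimate. The block dimensions, 1 mm core spacing, and edge margin below are illustrative assumptions, not a published specification.

```python
# How many 2 mm cores fit in one OCT block?
# All dimensions are illustrative; actual block sizes vary by vendor.

def core_capacity(block_w_mm, block_h_mm, core_d_mm=2.0, gap_mm=1.0, margin_mm=2.0):
    """Cores per row/column on a regular grid: each core occupies its
    diameter plus one gap, inside the area left after edge margins."""
    pitch = core_d_mm + gap_mm
    usable_w = block_w_mm - 2 * margin_mm
    usable_h = block_h_mm - 2 * margin_mm
    cols = int((usable_w + gap_mm) // pitch)
    rows = int((usable_h + gap_mm) // pitch)
    return rows, cols, rows * cols

rows, cols, total = core_capacity(24, 24)
print(f"{rows} x {cols} grid -> {total} cores per block")  # 7 x 7 grid -> 49 cores per block
```

Even under these made-up dimensions, the arithmetic shows why dozens of cores (and, on larger slides, hundreds of samples) can share a single block.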

Features of frozen tissue array

Every product is designed to conform to FDA guidelines and must meet the requirements for therapeutic antibody validation and in vitro diagnostic device certification. Each array contains a vast range of tissues. The technique is suitable for both radioactive and non-radioactive detection, and arrays combine tissue from a variety of human donors. Compared with paraffin-embedded tissue, frozen tissue array offers better antigen exposure.

Frozen Tissue array applications

The technique has been employed in areas such as:

  • Rapid screening of novel gene or protein expression against a large panel of tissues
  • High-throughput diagnostic and therapeutic antibody analysis
  • Analysis of gene expression patterns
  • In situ hybridization, used together with immunohistochemistry
  • Comparison of novel gene and protein expression
  • FISH-based experiments for the analysis of DNA

In summary, frozen tissue array provides an excellent target material for the effective study of RNA, DNA, and proteins.

Samples of DNA, RNA, and certain antibodies don’t perform optimally in pre-fixed, paraffin-embedded tissues, but they work well in frozen tissue array. Procedures that require fixation can still be identified and carried out appropriately. This means you can include a wider array of samples in your final analysis than with the paraffin-embedded procedure.
The main drawback of frozen tissue array is that some distortion of cell morphology and tissue architecture is likely to occur, which becomes apparent when frozen sections are compared with paraffin-embedded ones. Additionally, only a limited number of samples can be embedded in one array, because the OCT compound tends to crack or bend when samples are placed one millimeter apart.

Conclusion 

This technique has become a boon to scientists around the world, saving scientists and pathologists significant time when running multiple tests. It also has numerous potential applications in basic research, prognostic oncology, and drug discovery.

Determining methods of Automated Nucleic Acid Extraction

By Lorenzo Gutierrez

The human body is a complex structure made up of various cells and genes. Genetic identification in humans centers on one’s DNA, or deoxyribonucleic acid, which is present in nearly all living organisms as the main constituent of chromosomes. With the emergence of a variety of communicable diseases, it is important for researchers to be able to extract DNA, so they can run tests to see how best the world’s population can extend its life span through science.

What is Automated Nucleic Acid Extraction?
This refers to the isolation of DNA by mechanical, automated means. Extraction this way is considered more accurate and more beneficial to science because it reduces the margin of error, or so it is alleged. “Automated nucleic acid extraction systems can improve workflow and decrease variability in the clinical laboratory.”[1] Various methods are available, and as science evolves, so does the technology behind them.

Methods of Automated Nucleic Acid Extraction
There are various methods of extraction and various machines used by researchers on a day-to-day basis to obtain much-needed DNA samples, since the fight toward cures for many communicable diseases is a tedious process. Technology exists to lighten the workload and to aid the push toward more accurate results. Many companies have developed extractors that operate at varied levels: some are workhorses, able to complete massive amounts of work, while others produce only an average turnout. Laboratories vary in size, and each can choose the extractor best suited to its work.

First there is the manual means of extraction, which you can think of as good old reliable. Researchers are heavily consumed by work when they use manual extraction methods because the process is so hands-on. Some technology is still involved, but the researcher must be present to adjust variables and add reagents as the need arises.
Automated extractors allow researchers to set their samples in the machine and leave to complete other tasks. Researchers aren’t needed at every step, as the technology does most of the work once the samples are prepared and loaded. It must be noted that with greater technology comes greater cost: where a manual extraction can be performed for approximately $5 per sample, an automated extractor can cost anywhere from $7.60 to $12.95 per sample.
True to human nature, researchers tend to gravitate toward a more established extractor because it has been around longer and has been the subject of numerous reports. It is still important to try new machinery, however: before the most renowned extractor earned its reputation, it too was merely “extractor X,” an unknown machine with the potential to ease the workload.

Research of two methods [2]
For the purposes of this article, we will look at one study performed by a group of research scientists; their information is provided below. Comparing the extraction methods, the first extractor, X, was reasonably efficient, with recovery varying from 86% to 107% of manual. The second extractor, Y, recovered 83% to 107% relative to the manual method. Though the results varied only marginally, the true variation came by way of cost: extractor X was the most costly at $12.95 per sample, while Y cost $7.60. There is also a key operational difference: X does not allow the researcher to walk away and leave the machine to perform its extraction, and it needs a higher volume of sample to perform its task. Automated nucleic acid extraction is beneficial to researchers because it yields recovery comparable to manual extraction with far less hands-on time. It is, however, a more costly approach.
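Putting the numbers quoted above side by side makes the trade-off concrete. The per-sample prices and recovery ranges come from the text; the batch size is an arbitrary figure chosen purely for illustration.

```python
# Per-sample cost vs. recovery, using the figures quoted in the text.
methods = {
    "manual":      {"cost": 5.00,  "recovery": (1.00, 1.00)},  # baseline
    "extractor X": {"cost": 12.95, "recovery": (0.86, 1.07)},  # vs. manual
    "extractor Y": {"cost": 7.60,  "recovery": (0.83, 1.07)},  # vs. manual
}

batch = 500  # hypothetical number of samples, for illustration only
for name, m in methods.items():
    lo, hi = m["recovery"]
    total = m["cost"] * batch
    print(f"{name:12s} ${total:8,.2f} per batch, recovery {lo:.0%}-{hi:.0%} of manual")
```

At this batch size, extractor X costs more than twice as much as manual extraction for broadly similar recovery; what the price buys is walk-away time, not better yield.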

[1] Dundas N., Leos N.K., Mitui M., Revell P., Rogers B.B. (2008). Comparison of automated nucleic acid extraction methods with manual extraction.

[2] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2438199/ (retrieved August 3, 2017)