Friday, August 12, 2016

Lab Internship: Week 6 (Final Week)

Everything I wrote got deleted, so I am rewriting it all on Thursday, 8/11.  Not everything will be accurate since I am not writing on the day of, but I'll try my best.

Monday 8/8
Victor is planning on doing another experiment with me.  Originally, it was supposed to be the time-lapse experiment, but because of tricky scheduling, he decided not to do it.  Instead, we are doing an experiment with RELA, a transcription factor in the NF-kB family.  He wants to find the protein concentration in cells and how much of it translocates into the nucleus.  RELA generally sits in the cytoplasm of the cell, bound by specific inhibitor proteins.  When the cell is perturbed, RELA breaks away from those proteins and goes into the nucleus to regulate transcription.  We are not perturbing the cells with any external stimuli, such as TNF, so we can accurately image the baseline RELA concentrations in the cells.  Victor also wants to fix the cells so they do not escape the traps, like they've been doing in my TNF experiments.

Tuesday 8/9
I attended a thesis defense for a graduate student named Stacey Lawrence.  It was at the Kline Geology building behind the Peabody.  She presented her research on how G proteins affect the immune receptors/response (I think, I forgot since I accidentally deleted everything I wrote.  I also fell asleep halfway through the presentation, whoops).  The closed Q&A session with the thesis committee, which usually goes on for about 15-20 minutes according to every grad student I asked, lasted for over an hour.  The food from Thali was worth it, though.

I went back to lab and started on my experiment.  I did the first stain of NF-kB antibodies.  The antibodies will attach themselves to the RELA.  I also "fixed" the cells, meaning that I killed them so they become rigid.  It's not apoptosis.  I also had to permeabilize the cells to allow the stain to get inside of the cell.  I used ice-cold MeOH to permeabilize.  Don't know how it works, never asked (according to grad students, as long as it works, you don't ask questions).  I then put the device with the cells in a container and into the fridge to leave overnight.  Victor wrapped the container in parafilm to avoid evaporation.  This would cut off the exchange of gases, but since the cells are already dead, it does not matter.

Wednesday 8/10
I started work in the lab early, and by early, I mean I started as soon as I walked in.  I applied the 2nd stain to the cells.  The stain was called Alexa 488, a molecular probe that attaches itself to antibodies.  The devices had really good trapping efficiency, with over 80% of the traps filled with at least one cell.  Since the devices were in the fridge overnight, there was condensation on the devices, so it was somewhat difficult to see the channels at first.  I also added a Hoechst stain so I can identify the DNA and the nucleus in the cells.

There was a lab meeting today where Victor Bass, another graduate student, was presenting his progress on his research.  He was studying TNF and LPS stimulation.  I did not really understand everything because of the graphs, since they always mess me up.  Andres was supposed to present a paper, but he forgot, so it was a short meeting.

After the lab meeting, I imaged my cells in the devices.  I took 10 pictures with at least 15 cells each, so at least 150 cells, 200 at most.  HIV was activated in some cells, making them glow like something out of the Matrix.  Since those cells were so bright, I could not see their RELA concentrations, so I could not quantify them.  I could only quantify the cells without active HIV (all of the cells had RELA).  I have to analyze the light intensity (the grayscale intensity), the standard deviation of the light intensity, and the integrated density.  The integrated density is the area times the mean gray value, or light intensity, so it represents the total light intensity of the entire cell.  I will have to analyze each cell individually since the data also includes the cells with activated HIV, so that's a pain.
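For my own notes, here is how those measurements relate to each other, sketched in plain Python with made-up pixel values (not my actual data from the microscope):

```python
# Sketch of the ImageJ-style measurements for one cell.
# The grayscale pixel values below are invented for illustration.
from statistics import mean, pstdev

pixels = [120, 135, 128, 142, 150, 138, 125, 131, 140]  # intensities inside one cell outline

area = len(pixels)                      # number of pixels in the cell
mean_gray = mean(pixels)                # average light intensity
std_dev = pstdev(pixels)                # spread of the intensities
integrated_density = area * mean_gray   # ImageJ's "IntDen": area x mean gray value

# Note: area x mean is just the sum of all pixel intensities,
# which is why IntDen captures the whole cell's light.
print(area, round(mean_gray, 2), round(integrated_density, 2))
```

The last comment is the useful part: integrated density works out to the sum of every pixel's intensity, which is why it stands in for the total light of the cell.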

Thursday 8/11
I met with Kathryn, my P.I., in the morning to discuss how things have been going in the lab.  I told her about my RELA experiment and that I had a great time learning and doing experiments in the lab.  We talked about whether I could come work in the lab throughout the year, but she did say scheduling would be tricky.  I also just realized that I will be busy throughout the year with college classes and extracurriculars, so I will have to tell her that.  She is planning on taking another high school intern next year, but also wants me to return so we can learn together.  I can help the high school student as well as learn in the lab.  Kathryn also told me about Victor's paper on how RELA regulates gene expression since it's a transcription factor, and how Victor is trying to determine a way to predict how RELA will behave.  They plan to use this work to treat disease and things like that.

I also went with Victor to West Campus since he had to use some tech there for his experiment.  Victor had asked on Tuesday if I wanted to go with him, so I said yes.  We ate tacos at the cafeteria and met with another graduate student there.  The microscope Victor used was extremely powerful, with magnification of up to 100x.  Victor was going to use the microscope for a long time, so I left at 3:30 because I needed to go home.

Friday 8/12
I got gifts for Victor, Laura and the professor.   I also got donuts for everyone, but only Andres and Victor Bass showed up to lab.  I just stayed in lab and worked on my personal projects.

Internship Review

Reviewing the summer goals I set at the beginning of this internship, I did fairly well and got comfortable openly asking my lab members questions when needed. This made speaking up feel more familiar, though I did not participate in any presentations, so my practice with presentation skills has been minimal. I put a lot of effort into what I was learning, and I got to learn more about the process these research projects go through. This summer, I was able to properly pace myself through the long-term project of creating a poster to present what I had learned and worked on. My internship goals of pacing myself and speaking out were completed.

This summer, what I really enjoyed was watching how other lab members worked through their projects. I loved shadowing them as they worked on the equipment or listening as they presented their projects. I really wish we had more presentation opportunities and more chances to see the process that other lab members go through.

Through this internship I learned how to work an image processing program, how to more comfortably speak up and ask questions, how to pace myself on a long term project, how different electron microscopes work, how to properly organize a research poster, and more about the nanotechnology field and what it entails.

Thursday, August 4, 2016

Introduction and Procedure Poster

Introduction:
Bulk metallic glasses are amorphous solids that are created using metallic elements. The word “bulk” simply refers to good glass-forming ability. A pure glass is a solid that bypasses crystallization, which means it has no long-range atomic order whatsoever. Even the slightest bit of crystallization (in other words, order in the atomic arrangement) disrupts the purity of a glass. Crystallization is defined as the point at which the crystalline volume fraction within the melt reaches some small but finite value.
This suppression of crystallization in order to form a glass is equivalent to the formation of a super-cooled liquid. This research examines the distances between particles that share forces in a super cooled liquid in order to construct a neighbor list.
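The neighbor-list idea can be sketched in a few lines of Python (the coordinates and cutoff distance below are invented for illustration; the actual research code runs on the cluster):

```python
# Minimal neighbor-list sketch: for each particle, record every other
# particle that sits within a cutoff distance (i.e., close enough to
# share forces). Positions and cutoff are invented for this example.
import math

positions = [(0.0, 0.0, 0.0), (0.5, 0.0, 0.0), (3.0, 0.0, 0.0), (0.0, 0.4, 0.0)]
cutoff = 1.0

def neighbor_list(positions, cutoff):
    neighbors = {i: [] for i in range(len(positions))}
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            # Euclidean distance between particles i and j
            if math.dist(positions[i], positions[j]) < cutoff:
                neighbors[i].append(j)
                neighbors[j].append(i)
    return neighbors

print(neighbor_list(positions, cutoff))
```

A real molecular dynamics code would also handle periodic boundary conditions and avoid the all-pairs loop for large systems, but the bookkeeping is the same.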

Procedure:
First, one needs to create a code that constructs a neighbor list while keeping the temperature constant (NVT). This code needs to be run on a compute node in one of the Yale HPC clusters. Along with this code, scripts containing source code need to be run in the clusters to create the necessary files for the code. The output data should be directed into a folder so that it is stored successfully.

Lab Internship: Week 5

Monday 8/1
I began to research TNF and HIV in an effort to comprehend my experiment and data.  So far, I found two hypotheses that I need to look into: TNF activates the NF-kappa B pathway, activating the NF-kB receptors, which HIV then uses to infect the cell; HIV attaches to and infects cells via TNF receptors.

I made some more devices for Laura.  The lab got two new master slides for the flow-patterning devices and Laura did not really like them.  She said that they were too sticky.  I found that they were the same, mainly because I did not really care.  I found that the area in which I poured the PDMS produced the most bubbles, so I focused on pouring PDMS in empty spaces so I could pop less bubbles.  I also punched holes for the inlets and outlets for some flow-patterning devices for Laura.  It took approximately 20-30 minutes (I used music as a method to keep track of time).

Tuesday 8/2
I did some more research on TNF, and I also asked Victor why TNF activated HIV.  He said that the TNF pathway activates NF-kappa B, AP1, and SP1 receptors, as well as the NF-kappa B pathway, which HIV uses to infect the cell and proliferate.  However, I also found a paper from France that explains how HIV actually takes over the TNF pathway.  Victor asked me to send him that paper, so I did.  I need to read that paper, but I also need to learn how HIV takes over the NF-kappa B pathway.

I also ran the TNF experiment again.  However, this time I changed the amount of media in the devices at the end since Victor believes that we lost the cells because flow was insufficient.  I ran the same procedure: I plasma bonded the devices, put DI water in the devices to ensure flow could occur, primed the devices with RPMI, loaded my cells, washed the cells with RPMI, stained them with Hoechst, then loaded the control devices with RPMI and the manipulated devices with TNF.  I left them to incubate with a humidifier for 24 hours.  The trapping efficiency in the devices was not too great, but most devices were above 80%, so it should be fine.

Wednesday 8/3
Victor did not show up today, and will not show up for the rest of the week.  He said he can come in tomorrow and prepare cells for another experiment that I can run on my own.  I will have to prepare the cells properly and without assistance (Ay, so much pressure, hopefully I don't break anything).

I learned that HIV cannot use the TNF pathway to infect cells, only the NF-kappa B pathway.  I began to read a paper on how HIV exploits the NF-kappa B pathway in an effort to better understand what is going on inside the devices.

Victor forgot to put me in the schedule for the microscope...
Someone else scheduled the microscope for the whole day (11-6).  Now, I have to use another microscope, the Nikon.  I need to remember to put myself on the schedule for the EVOS microscope when I do the experiment again tomorrow.  This could've been really bad.  Thank god for Laura, saving my butt again.

Speaking of Laura, I cleaned some of her devices and punched holes in 6 of them now.  It used to be 4, but since we got the new master slide for the flow-patterning device, we have 6.  Hooray.  More work.  (Punching holes in those devices takes forever, like 30-45 minutes.  Thank god for music.)

Thursday 8/4
Victor did not come again today.  He said he has a family emergency.  Laura will be preparing my cells, but instead of stimulating them with TNF, we will be stimulating them with LPS.
LPS (lipopolysaccharide) is a large molecule found in the outer membrane of gram-negative bacteria (bacteria that don't have much peptidoglycan, the substance that forms bacterial cell walls).  It acts as an endotoxin and induces an immune response in humans when it binds to receptors, such as the CD14 receptor.  Downstream, LPS activates NF-kappa B via the CD14 pathway, so it is likely that LPS will activate HIV in the cells.

During the experiment, I was supposed to make a solution of LPS and RPMI so the cells could be stimulated by LPS while living in a favorable environment.  However, I accidentally put pure LPS into two devices and killed the cells.  Laura helped me get more LPS, and I made the solution for the devices.  I had to take everything out of the two devices I accidentally messed up.  The devices were still intact and undamaged, so that was good.  I redid the procedure for those two devices while the rest of the devices were still fine.  I constantly checked on the devices under each microscope to ensure that cells were still trapped and the device was functioning properly.  I then put a humidifier into the container with the devices at 1:25 pm, so by 1:25 pm tomorrow, I will need to image my cells.

Also, I realized that no one taught me the cleaning procedure for experiments.  I asked Laura, and she taught me what to toss out, what I can immediately throw in the trash and what I have to bleach first, and how to sterilize the workbench properly.

Friday 8/5
Most of the cells in my devices disappeared... once again.  After looking at my devices, I realized why the cells were disappearing.  The flow pressure from the large amount of fluid in the inlet caused cells to escape the traps and end up in the outlet.  I would like to try decreasing the amount of liquid in the inlets and outlets, but I do not want to put in so little that it equilibrates too quickly, which would disrupt flow.  Laura did not prepare the Hoechst stain properly, so I could not see the cells under the DAPI channel (the filter for seeing the stain).  I imaged what I could, but it is likely that I will not even bother to gather data from these images.  I did notice that LPS slightly activated HIV in some cells, while inducing apoptosis in others.

Friday, July 29, 2016

Lab Internship: Week 4

Tuesday 7/26
I just discovered that I was not supposed to culture cells, according to the safety advisor for the building.  So technically, I was not doing something that I was supposed to do.  Victor will now have to prepare all of the cells needed for experimentation.  Victor is planning on doing an experiment testing how TNF (tumor necrosis factor) affects cell behavior.  TNF is a cytokine that regulates many cell functions, such as proliferation, apoptosis, coagulation, etc.  In this experiment, Victor explained that the TNF will excite the cells and may cause them to jump around and possibly out of the traps.  I prepared the cells in four devices with Hoechst stain so we can identify the cells and I introduced TNF into two of the four devices, yet the trapping efficiency was not 100% in all devices.  Victor said we will have to image specific regions of the channel to see how the TNF affected the cells.  We just need to see the cells, so the trapping efficiency is irrelevant.

Wednesday 7/27
Victor asked me to accompany him to the stock room to get some supplies.  I assumed the stock room was in the basement of the building, but it was really in KBT.  The stock room was reminiscent of a corner store, like a drug dealer's grocery store.  Walter White would probably love the stock room.  There are two on Yale's primary campus: one at KBT and one in the med school.  There is also one at West Campus, but that's far away (at like West Haven or something).

There was also a lab meeting where one of the labmates, Linda Fong, presented her progress on her project on how p38, a mitogen-activated protein kinase, affects HIV latency and apoptosis of HIV infected immune cells, which could lead to more effective forms of therapy.

I also began to learn ImageJ to analyze the images of the devices I took with Victor.  I need to learn how to use ImageJ and develop a macro to count the number of cells in the image and differentiate between the cells.  CellProfiler is another program I can use, but I do not know how to use it.  I asked Andrés for help, but he was busy with his project.

Victor and I were about to image the devices with cells for the TNF experiment, but we found out that the RPMI evaporated in the incubator and the cells died as a result.  Many traps in some of the devices also disappeared.  Victor guessed that this occurred because there was not enough humidity in the incubator.  Tomorrow, we will begin repeating the experiment by preparing the devices.

I found a way to analyze an image and count the cells.  I just need to use the Color Threshold function to "select" the cells and analyze the particles with the Analyze Particles function.  I then get a table full of data with the measurements of each particle.  I already tested this method with images from the efficiency experiment.  The method was accurate.  I will use this method with my experiment to count the cells and further test the efficacy of this method.
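The threshold-then-count idea can be sketched as a toy in Python (this is not ImageJ itself, and the grid values are invented; real images would of course come from the microscope):

```python
# Toy analogue of ImageJ's Threshold + Analyze Particles workflow:
# binarize a small grayscale grid, then count connected bright regions
# ("particles") with a flood fill. All values are invented.

image = [
    [10, 200, 210, 10, 10],
    [10, 205, 10, 10, 180],
    [10, 10, 10, 190, 185],
    [10, 10, 10, 10, 10],
]
threshold = 100

def count_particles(image, threshold):
    rows, cols = len(image), len(image[0])
    binary = [[v > threshold for v in row] for row in image]  # "select" the cells
    seen = set()
    count = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and (r, c) not in seen:
                count += 1                  # found a new particle
                stack = [(r, c)]            # flood-fill its connected pixels
                while stack:
                    y, x = stack.pop()
                    if (y, x) in seen or not (0 <= y < rows and 0 <= x < cols):
                        continue
                    if not binary[y][x]:
                        continue
                    seen.add((y, x))
                    stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return count

print(count_particles(image, threshold))  # -> 2 bright regions in this toy grid
```

ImageJ's Analyze Particles does much more (size filters, circularity, per-particle measurements), but the core logic is this: threshold to a binary image, then group connected foreground pixels.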

Thursday 7/28
I prepared four flow-patterning devices for Laura and her experiment.  I cut them out into individual devices, then I cleaned them.  I also punched holes in the inlets and outlets (20 inlets, 20 outlets for each device).

I prepared four more devices for my experiment (take 2).  I used J65C 6.6 for my experiment, not 4.4 since 6.6 was thinner and less clumpy and dense.  I prepared two of the four devices as controls, and the other two as the manipulated group.  I will stimulate the cells in the manipulated group with TNF to see how TNF affects cell behavior.  According to Victor, the TNF will excite the cells in a way that will make the cells jump out of the traps.  I will leave the cells in the devices, and leave the devices in an incubator for 24 hours.  I will also put a humidifier (fancy term for water in a bowl) with the devices to ensure that the RPMI will not evaporate so the cells do not die.

Friday 7/29
Most of the cells in the devices disappeared.  We do not know why.  Everything was perfectly fine when we incubated the devices.  Maybe the humidifier loosened the bonding of the devices?  We checked the outlet and everything, but nothing was different--besides the fact that most of the cells were gone.  It's not like they planned a prison break or anything.  Cells can't do that--I think.  There were a few cells I could image in specific regions of the devices, so I imaged them.  The control devices were plain blue because of the Hoechst stain, but the TNF activated HIV in almost 100% of the cells, so that made for some pretty pictures.  It almost calmed the stress and shame of having yet another near-failed experiment.  Victor told me this happens all the time, which just made me feel bad for scientists everywhere.  Now, I have to understand my data and learn what it means.  It's likely that the TNF acted as an activator for transcription, allowing HIV to be transcribed and become active in the cell, but it could also be that TNF activated a specific MAPK (mitogen-activated protein kinase) pathway.  MAPK pathways are involved in many cellular activities, such as gene expression, proliferation, mitosis, and apoptosis.

The Transmission Electron Microscope

When compared to a light microscope, the transmission electron microscope (TEM) may seem similar in concept, but there are a few things that have been changed. The light source is replaced with an electron gun, and the glass lenses are replaced with electromagnetic lenses. The entire chamber through which the electrons pass is put under a vacuum, to avoid electron collisions with air molecules, and magnification variety is achieved through different currents in the electromagnetic lenses (as opposed to just moving the glass lenses in a light microscope).

The electron gun is usually composed of a heated tungsten filament, which emits an electron cloud. This makes up the electron source, from which the electrons are pulled, accelerated, and then focused onto the sample. The sample has to be really thin (most are approx. 0.5 micrometers!) for the electrons to pass through and form an image. When the electron beam hits the sample, multiple things can happen: the electrons can be absorbed, scattered, or even reflected, depending on the thickness of the sample and its composition. The image is formed on a fluorescent screen using the information gathered from the different actions of the electrons. This image can then be photographed for later use or documentation.

(see article link in "All You Wanted to Know About Electron Microscopy...But Didn't Dare to Ask!" post)

-Patricia Acorda

Lab Update: Week 4

7/25 Monday - Worked on poster, asking lab partner for feedback on conclusion and title sections. Completed these sections and started writing up a new Methods section. Added references to photos. Went home early because of a stomach ache.

7/26 Tuesday - Finished writing up Methods section and sent poster to Apple for feedback. Using feedback, reformatted Results section again, trying to remove bullet points and labeling sections. Made minor edits to the Methods section and sent back for feedback. Attended a presentation on the other lab members' projects and participated in the ImageJ quiz made by the teachers, winning first place.

7/27 Wednesday - Awaiting feedback from Apple. Went to Hillhouse to fill out the CRISP program survey.

-Patricia Acorda

Internship Week 4

I have finished converting the NVE code into an NVT code. The major changes were the order of the subfunctions and the contents of the “movea” and “moveb” subfunctions. I sent it to my mentor for revision and copied the revised file into both of the Yale clusters that I have access to--Omega and Grace. The revised file rearranged some of my original subfunctions, added some outputs, and set the velocities to 0. I was having trouble compiling the code in both clusters; every time I issued the command, I received an error message informing me that the compilation had been aborted. I worked with my mentor, and we found that there were complications in using the “scp” command to copy the code into the clusters. We copied and pasted it manually, along with the source files, but we only did this in Omega as opposed to both Omega and Grace.
I read up on scripts and refreshed myself on the vi editor for about an hour. My mentor created some scripts, and I copied them into Omega under a directory I created named “Scripts”. He explained the scripts to me line by line, and I ran certain parts of them according to my immediate goal. For example, I put a pound sign in front of every line of the script except the loop that created 16 files under my “scratch” folder. After the loop, the script issued commands to open and close the 16 files, so I copied a source file named “input.txt” into the space between the commands so that its contents would be copied into the 16 files. After this was done, I put pound signs in front of these lines because their job had already been executed. I then removed the pound signs from the loop that created a file named “ignition2.tasks”. Next, I put the pound signs back and removed one from the line that specified parameters such as the requested amount of time, the memory, the queue, etc. Then, I submitted the job. There were two other jobs already running.
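The file-generation part of that workflow looks roughly like this in Python (a hypothetical sketch: the "input.txt" name comes from the script, but the task-file names and contents here are invented, and I use a temp directory instead of the real scratch folder):

```python
# Hypothetical sketch of the script's loop: create 16 task files and
# copy a shared input file's contents into each one.
# File names and contents are invented; only "input.txt" and the count
# of 16 come from the actual script.
import os
import shutil
import tempfile

workdir = tempfile.mkdtemp()  # stand-in for the real "scratch" folder

# The shared source file whose contents get copied into every task file.
source = os.path.join(workdir, "input.txt")
with open(source, "w") as f:
    f.write("parameters for the run\n")

# The loop that generates the 16 files.
for i in range(1, 17):
    task = os.path.join(workdir, f"task{i:02d}.txt")
    shutil.copyfile(source, task)

made = sorted(n for n in os.listdir(workdir) if n.startswith("task"))
print(len(made))  # -> 16
```

The real version is a cluster submission script in shell, with the extra lines commented out using pound signs exactly as described above.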
When I returned the next day, the job was complete. I experimented with some commands on the cluster and attempted to edit some of the code so that it would produce outputs in a file.

Sunday, July 24, 2016

Internship Week 3

I read about constraint methods in constant-temperature molecular dynamics as part of advanced computer simulation techniques. One method to fix the kinetic temperature is to rescale the velocities at each timestep by the square root of the desired thermodynamic temperature divided by the current kinetic temperature. This rescaling is applied while solving the set of equations of motion. It is common for configurational properties to be determined in a simulation while the precise kinetic properties are added in later.
I have been working on converting the NVE code (where energy is constant) to an NVT code (where temperature is constant). I have many references, such as a C++ coding book, pseudocode, and an NVT code written in Fortran. I expect to have it done by tomorrow so that I am able to run the program successfully on a Yale cluster.
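The velocity-rescaling idea can be sketched in Python (a minimal sketch in reduced units with mass and k_B set to 1; the velocities and target temperature are invented, and the real code is not written this way):

```python
# Velocity-rescaling thermostat sketch: multiply every velocity by
# sqrt(T_target / T_current) so the kinetic temperature hits T_target.
# Reduced units: particle mass = 1 and Boltzmann constant k_B = 1.
import math

def kinetic_temperature(velocities):
    # T = (1 / (3N)) * sum(v^2) in these reduced units (3 degrees of
    # freedom per particle; no constant-momentum correction here)
    n = len(velocities)
    return sum(vx * vx + vy * vy + vz * vz for vx, vy, vz in velocities) / (3 * n)

def rescale(velocities, t_target):
    t_current = kinetic_temperature(velocities)
    factor = math.sqrt(t_target / t_current)
    return [(vx * factor, vy * factor, vz * factor) for vx, vy, vz in velocities]

velocities = [(1.0, 0.0, 0.0), (0.0, -2.0, 0.0), (0.5, 0.5, 1.0)]
new_v = rescale(velocities, t_target=1.0)
print(round(kinetic_temperature(new_v), 6))  # -> 1.0
```

In the actual NVT code this rescaling would sit inside the integration loop, alongside the "movea"/"moveb" velocity-Verlet steps.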

Friday, July 22, 2016

Lab Update: Week 3

7/18 Monday - Went over Apple's feedback independently. Made minor edits to poster, mostly worked on reading up on lanreotide for the weekly article.

7/19 Tuesday - Went over poster feedback with Apple. Made major changes to the poster. Went back and revised my image processing system, using heavier filters and using a variety to be able to show the different results on my poster. Using a 9.0 pixel radius Mean filter instead of a 3.0 pixel radius Mean filter and a 3.0 pixel radius Gaussian Blur instead of a 2.0 pixel radius Gaussian blur before using a Default threshold gave much smoother results.

7/20 Wednesday - Worked more on formatting the Results section of the poster, using the images from the process created yesterday. Got feedback from a friend, who suggested defining more of the ImageJ vocabulary.

7/21 Thursday - Attempted to use the same imaging process on other images, but ran across a problem with analyzing the binary image created. Added more to the Abstract and reformatted the poster. Experimented more with different filters, trying to find a way to get rid of more grayish artifacts that were interfering with some particle outlines. Wasn't able to find a way to separate the two. Still stuck on how to analyze some of the particles.

7/22 Friday - Experimented more with analyzing particles. Played around with estimated particle size to try to get proper outlines of the lanreotide nanotubes. Worked on conclusion, methods, and reference sections of the poster. Implemented feedback from friend and added a bit more explanation on some vocabulary.

-Patricia Acorda

Thursday, July 21, 2016

Lab Internship: Week 3

Tuesday 7/19
Today, I did some cell culturing and split my cell groups in a ratio of 1:5 (1 mL of 1 mil cells : 5 mL of entire solution (cells and RPMI medium)).  This was the 29th passage, or the 29th time these cells have been split.  I took out 7.5 mL of cells from J65C 6.6 and left 2 mL in the container to grow.  I placed the 7.5 mL of cells into a 15 mL falcon tube and spun it in the centrifuge to get a pellet of cells at the bottom of the tube to run through three devices.  Once I got a pellet, I removed most of the old medium and added 500 microliters of new RPMI into the tube.  I then took the entire solution out of the falcon tube and through two cell strainers to ensure the cells were separate and not in clumps.  I ran them through the 3 devices I prepared earlier.  After a few minutes, I removed the leftover media and replaced it with Hoechst, a fluorescent staining agent that stains the DNA in the nucleus of cells.  I then placed the devices in a styrofoam box to block off all of the light since the Hoechst is light-sensitive.  After 20+ minutes, I removed the Hoechst and replaced it with media so the cells could survive.  I then checked the devices under the microscope to ensure that cells were trapped and no more cells were flowing; otherwise, it would interfere with the imaging process.  Afterward, Victor brought me to the EVOS microscope, a widefield fluorescence microscope used for cell imaging.  Victor showed me how to use it, and I took pictures of the cells trapped in the channels of the devices.  I used multiple filters to tell us different things about the cells.  For example, the GFP filter shows if HIV is present in cells by activating the GFP reporter attached to the end of the HIV strains, making the cells appear green.  The TxRed filter shows if there are proteins that stimulate the production of HIV, making the cells orange.  The DAPI filter makes the cells' nuclei appear blue since it reveals the Hoechst stain.
I took the images and used ImageJ to observe the pictures on my computer.

Wednesday 7/20
Uneventful day.  I did not do anything new, just made some more devices for Laura so she can do her experiment.  I did make two masters (not the device themselves, just casts) for future use.  I also punched out holes in four flow-patterning devices for Laura using a drill press in a lab on the first floor.  It took approximately 30 minutes to complete.  For the procedure on how I made the devices, refer to Lab Internship: Week 2.

I also read articles on cell signaling, which is what Laura is focusing her project on.

Laura informed me that she and Victor will not be in tomorrow, but I will still come in so I can pass some cells (cell culturing).  After that, I plan on reading for a while before heading home.

Thursday 7/21
I split J65C 4.4 and 6.6 1:5 successfully on my own.  I did not die, nor did the cells or station catch on fire, so I think it was an overall success, unless I missed something that I did not realize.  I warmed up my media, I made sure everything was clean and sterile, I did not spill the cells or kill them on accident, I did not break anything, I pipetted the solutions multiple times to get rid of clumps of cells, I only made a single bubble (which is pretty good), and I put in the proper amount of media (8 mL to go with the 2 mL of cells to make 10 mL of solution).  The solutions should be at 200,000 cells per mL now.  If I messed something up, Victor will notice when he splits the cells on Saturday.
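As a quick sanity check, the split arithmetic works out (this just re-derives the numbers above; the ~1 mil cells/mL starting concentration is the figure from Tuesday's entry):

```python
# Sanity check of the 1:5 split: 2 mL of cells at ~1,000,000 cells/mL
# diluted with 8 mL of fresh RPMI gives 10 mL at 200,000 cells/mL.
cells_per_ml = 1_000_000   # starting concentration (from the 1 mil cells/mL figure)
kept_ml = 2                # mL of cell suspension left in the flask
media_ml = 8               # mL of fresh RPMI added

total_ml = kept_ml + media_ml
final_conc = cells_per_ml * kept_ml / total_ml
print(total_ml, int(final_conc))  # -> 10 200000
```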

Laura is not here; she is actually on vacation.  I do not have much else to do besides some readings and extra work that I have.

Lab Update: Week 2

7/12 Tuesday - The focus of our research project was switched from iron sucrose molecules to lanreotide nanotubes. We were given the necessary information and supplementary articles by the lab members working on the project and given a chance to watch the lanreotide nanotubes being imaged on the transmission electron microscope (TEM), where we were also allowed to try using the TEM ourselves. We were assigned to work with the images from July 7, 2016 and given time to freely experiment with different ImageJ functions. I created my first imaging process theory, recording my process in Word, though my imaging results weren't that great.

7/13 Wednesday - I did more independent research on the importance of lanreotide for the purpose of my poster, which I managed to organize and complete a little bit of. Continuing from Week 1, we learned more about using ImageJ and how to properly process the images given to us. We were given more advice on what different filters did to images and the reasoning behind using filters and auto thresholding techniques as well as the importance behind the image histograms.

7/14 Thursday - I did more poster organization, beginning to actually create it in powerpoint and plan out what to write. I did more experimentation with ImageJ, really starting to understand the program after further instructions and advice from a program expert. I developed my second imaging process theory, this time with better results. 

7/15 Friday - I finished the Results and Discussion section of my poster using the second imaging process theory and sent the poster to Apple for feedback.

-Patricia Acorda 



Wednesday, July 20, 2016

Bulk Metallic Glass

 
Bulk metallic glasses (BMGs) are a class of materials that offer many desirable and unique properties and processing routes. An improved universal description of the structure and behavior of BMGs should lead to better tools for their design and commercialization. Metals and alloys normally exist as crystalline materials. Compared to their crystalline counterparts, bulk metallic glasses often exhibit very high strength, corrosion resistance, and elasticity. The BMGs with the best glass-forming abilities are Zr- and Pd-based, and glass-forming ability is often reported in terms of a “critical casting diameter”.
Small changes in composition can lead to large changes in properties. Several rules exist correlating a single physical parameter with glass-forming ability. However, only some of these correlations are useful for predicting glass-forming ability. These rules still require knowledge of properties of the alloy to allow prediction of glass-forming ability. A better understanding of the atomic structure of BMGs is developing through analyses of the simplest binary alloys. The BMG structure is a randomly packed assembly of the different atoms. However, although BMGs lack long range atomic order, they do exhibit short and medium range order over several atomic lengths.
Compared to other engineering materials, BMGs are used in small volumes, but most are very strong in compression. Many engineered parts require good toughness and strength in tension rather than compression. BMG composites have been designed by matching the microstructural length scale of the crystalline phase to crack length scales, which resulted in high tensile ductility and fracture toughness. A unique property of BMGs is their ability to reversibly transform from the low-temperature glassy state into a supercooled liquid state above the glass transition temperature. Many BMGs are not well adapted to conventional forming at room temperature, but forming in the supercooled liquid region (SLR) has been successful in most systems. Extrusion, closed-die forming of micro-components, and more innovative techniques such as blow molding have also been demonstrated.
BMGs can be formed into very precise shapes while in the SLR.
Production technologies for crystalline metals have matured, but BMG production is still in its first stage. A variety of casting techniques and fluxing strategies are now used to produce BMGs. BMG properties are said to depend on cooling rate, so the choice of manufacturing route can be important. Gas atomization or mechanical alloying can be used to produce amorphous powders.

Biomimetic Organization: Octapeptide self-assembly into nanotubes of viral capsid-like dimension

Recently, scientists have been interested in the ability of certain simple molecules to spontaneously assemble into nanostructures because of their potential applications in biotechnology and materials science. Certain biological self-assemblies, including tobacco mosaic virus capsid proteins, tubulin, and actin, are able to form long filaments with diameters under 100 nm, but their high fabrication costs eliminate their potential for use in practical applications. A simple alternative based on simpler molecules has arisen, though it requires a deeper understanding of the relationship between the molecular structure and the self-assembly process of the nanostructures. Additionally, until now, no simple synthetic molecule has been able to self-assemble into nanotubes with diameters in the 20-30 nm range.

Lanreotide, an octapeptide synthesized as a growth hormone inhibitor, is an excellent example of a molecule able to self-assemble into a well-defined nanostructure. By studying this self-assembly process, researchers were able to reach a better understanding of the systems involved in the formation process. It was determined that the diameter of the nanotube could be changed through modifications to the molecular structure, and that the system reveals the minimal interactions needed to form large self-assembled nanotubes (as observed with the biological self-assemblies mentioned above). In conclusion, the exemplary system of self-assembling lanreotide nanotubes in water can still be investigated for further applications.


-Patricia Acorda

Friday, July 15, 2016

Lab Internship: Week 2

Monday 7/11
Professor Miller-Jensen and her lab members gave a presentation on microfluidic devices for high school students in the SCHOLAR program.  They asked me to observe and take notes on the presentation so they could improve it for next year.  I actually assumed the presentation started at 9:30 am because of a sign I saw outside the presentation room, but it was really at 11:15 am.  During that time gap, I helped Laura, a research assistant, and Victor Bass, a graduate student, prepare pH solutions for the high schoolers to test their devices on.  We prepared four solutions: one with a pH of 4, two with a pH of 7, and one with a pH of 10.  I used a pipette to put the solutions in containers labeled A, B, C, and D, one container for each solution.  The presentation was good, and the students seemed interested in microfluidics.  Some even asked if they could intern at Professor Miller-Jensen's lab since I was interning here.  The professor also had the students do a hands-on activity in which they made their own microfluidic devices using filter paper and crayons.  Students drew a path with a crayon and melted the wax to create a channel the fluid could not escape.

I asked Victor about the goal of my internship, and he said that he was mainly teaching me technical skills so that I know how to operate and conduct myself in a lab in preparation for the future.  He also wants me to make a passive-flow microfluidic device and test its efficacy for another experiment which will examine how heterogeneity affects cell-to-cell communication and the production of cytokines when a cell from the immune system (T Cell) is infected by HIV.

Tuesday 7/12
I made more devices by producing a PDMS solution and making molds of the passive-flow device. Victor let me do all the work by myself; he just watched to make sure I did not make a mistake.  I prepared the PDMS by mixing the base with a curing agent in a 10:1 ratio (25 g of base, 2.5 g of curing agent).  After, I poured the PDMS into a container in the hood and used a centrifuge to get rid of the bubbles in the substance.  I had to keep the centrifuge balanced by placing opposite my PDMS a container filled with an equal weight of water so it would not tip over.  After using the centrifuge, I cleaned out the passive-flow device to make sure there was not any extra residue of PDMS present; otherwise it would mess up the device.  I poured the PDMS into the container holding the passive-flow device and baked the cast in an 80˚C oven for 2+ hours.  While the mold baked in the oven, Victor and I split my cells (J65C 6.6 and 4.4 -- just labels for the cells) in a 1:4 ratio (1 mL of cells : 4 mL of whole solution) to make sure they don't overpopulate.  In each container, there was 8 mL of solution with 1 million cells per mL.  I split the cells by pipetting them back and forth in the container to get rid of any clumps in the cell populations.  I then left only 2 mL of cells in each container and took out 6 mL.  I dumped the 6 mL from 4.4 into a bleach container to kill the cells, but I put the 6 mL of cells from 6.6 into a container so we could run them through the devices we already made.  I replaced the 6 mL I removed from each container with 6 mL of RPMI, which is the medium we grow our cells in.  We then put the cell containers in an incubator so they could grow.  We put the sample of cells from 6.6 in a centrifuge so that we could get a pellet of cells at the bottom of the container.  
This was so we could remove the medium from the sample and resuspend to get 5 million cells in 1 mL of media (5 million cells per mL is the optimal concentration for testing in a microfluidic device).  Victor then used a pipette to pass the cells through a cell strainer to ensure there were no clumps of cells; otherwise the traps would not work well.  Victor and I took two of the devices we made last week and cut out holes for the inlet and outlet so cells can flow through the channel.  We then treated the devices and microscope slides with oxygen plasma in the Plasma Etch so we could attach the devices to the slides and make the devices hydrophilic so fluid can flow through them.  After securing the devices, Victor ran distilled water through the devices to ensure flow could occur and the channels remained hydrophilic.  Using a pipette, Victor and I removed the water from our devices and placed 50 µL of our cell samples into the devices.  We then observed the flow of the cells in the device under a microscope.
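The spin-down and resuspension step is really just a conservation-of-cells calculation (C1 × V1 = C2 × V2). A minimal sketch, assuming the numbers from this post (the 6 mL sample at roughly 1 million cells/mL, concentrated to 5 million cells/mL):

```python
def resuspension_volume_ml(stock_conc, stock_volume_ml, target_conc):
    """Volume of fresh medium to resuspend a cell pellet in so the
    final concentration hits the target (C1 * V1 = C2 * V2)."""
    total_cells = stock_conc * stock_volume_ml
    return total_cells / target_conc

# Pelleting the 6 mL sample from 6.6 (~1e6 cells/mL) and resuspending
# in the computed volume of media gives the target 5e6 cells/mL.
print(resuspension_volume_ml(1e6, 6.0, 5e6))  # 1.2
```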

Wednesday 7/13
I talked with Professor Miller-Jensen about my view of the presentation and any notes or suggestions I had for her.  My main suggestions were to go into more detail and explain some concepts further, since I believed many of the students were confused by some of the content.  I also recommended making visuals to help students picture how a microfluidic device works/flows so they can understand laminar flow better.  The professor told me how she wished there could be a workshop so she could explain the concepts behind microfluidics in more detail and have more hands-on activities for students.  It would definitely be a good idea; however, I am unsure if students would actually be willing to attend a multi-day workshop to learn about microfluidic devices.

I made more PDMS molds of microfluidic devices with Laura.  Instead of making molds of the passive-flow device, we made some for the well chip and the flow-patterning device.  I am still unsure of how they function, so I need to learn about that.  I followed the same procedure as I did with Victor on Tuesday, July 12th.  Laura and I punched holes in the devices for the inlets and outlets, and we cleaned the devices using tape to get rid of large sediment, as well as with cleaning solutions and methods such as ethanol rinses and sonication.  Sonication is the use of sound waves to vibrate substances; in this case, it is used to clean the devices and get rid of extra PDMS.

There was a lab meeting discussing a paper on the microbiome in the human gut and how a species of bacteria (B. dorei) with LPS (lipopolysaccharide) can inhibit immune education (the immune system adapting to illnesses early on -- the hygiene hypothesis).  Laura also discussed progress on her project studying cell signaling.  I did not understand a thing from the presentations.  I still need to learn more about science and the project, since I only have high school biology as my background.  So far, all I have been taught in lab is technical skills.

Thursday 7/14
Victor and Laura did not have much planned for me today.  The professor actually has Victor working on projects for the rest of this week and almost all of next week.  However, I still managed to accomplish a few tasks.  I cultured and split my cells (J65C 4.4 and 6.6).  4.4 seemed more dense, meaning that there seemed to be more cells in the container, but Victor assured me that the cells were just larger.  Victor did not guide me; he was merely there to supervise and answer any questions.  I successfully split the cells with little to no trouble.

Laura then asked me to make some devices with PDMS, so I did that, too--on my own.  Laura did not give me any instruction.  I successfully made two PDMS solutions to be used as molds for the well chip and the flow-patterning device.

Laura also sent me some reference articles and papers with background information on her project where she is studying cell signaling.  I am not completely sure on the specifics of her project, so once I finish reading the articles, I will be sure to ask her.

Internship Week 2

One of the readings I did this week provided foundational information about the computer simulations of liquids. My focus was on the sections of the literature that dealt with finite difference methods and the Verlet neighbour list.
The complete set of positions, velocities, and accelerations can be predicted from one timestep to the next by Taylor-expanding their current values. However, this prediction alone is not entirely accurate because it does not use the equations of motion governing the simulation. This is where a corrector step comes in, which “corrects” for these inaccuracies using the forces evaluated at the predicted positions. Usually only one corrector step is executed because corrector steps are generally very costly. A successful simulation algorithm should be quick, require little memory, allow the use of a long timestep, duplicate the classical trajectory as closely as possible, satisfy the known conservation laws for energy and momentum, and be simple in form.
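The predictor-corrector scheme from the reading is one family of finite difference integrators. A simpler relative, the velocity Verlet method, shows the same idea of advancing positions and velocities one small timestep at a time; this toy sketch (my own illustration, not code from the reading) integrates a single particle in a harmonic well and checks the energy conservation criterion mentioned above:

```python
def step(x, v, dt, force, m=1.0):
    """One velocity-Verlet timestep: predict the new position from the
    current velocity and acceleration, then update the velocity using
    the average of the old and new accelerations."""
    a = force(x) / m
    x_new = x + v * dt + 0.5 * a * dt * dt
    a_new = force(x_new) / m
    v_new = v + 0.5 * (a + a_new) * dt
    return x_new, v_new

spring = lambda x: -x          # F = -kx with k = 1 (harmonic oscillator)
x, v = 1.0, 0.0
for _ in range(1000):          # 1000 steps of dt = 0.01
    x, v = step(x, v, 0.01, spring)

# A good integrator should satisfy the conservation laws:
energy = 0.5 * v * v + 0.5 * x * x
print(energy)                  # stays ~0.5, the initial energy
```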
The Verlet neighbour list is built by measuring distances from a central particle to the particles that exert forces on it. These particles lie within a region called the potential cutoff sphere. There is also a larger shell around this cutoff, called the “skin”, containing particles close enough to the central particle that they could move into the inner sphere and exert forces on it at a later point in time. For this reason, their distances from the particle are measured as well. The list is updated periodically, usually every 10-20 timesteps.
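A minimal sketch of how such a list might be built (my own Python illustration, with made-up cutoff and skin values): each particle's list holds everything inside the cutoff-plus-skin radius, and force loops between rebuilds only visit these candidates.

```python
import numpy as np

def build_neighbour_list(positions, r_cut, skin):
    """For each particle, list the indices of all particles within
    r_cut + skin.  The skin catches particles that may drift into the
    cutoff sphere before the list is rebuilt (every 10-20 timesteps)."""
    r_list = r_cut + skin
    n = len(positions)
    neighbours = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(positions[i] - positions[j]) < r_list:
                neighbours[i].append(j)
                neighbours[j].append(i)
    return neighbours

# Three particles on a line at x = 0, 1, and 2.4:
pos = np.array([[0.0, 0, 0], [1.0, 0, 0], [2.4, 0, 0]])
print(build_neighbour_list(pos, r_cut=1.2, skin=0.3))  # [[1], [0, 2], [1]]
```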
At my internship this week, I have done a lot of reading to understand the computer simulations that are being done and the principles behind them, in addition to learning to code. I have read a sample code and am currently attempting to replicate it without looking at it. I am writing down the actions and purposes of each line of the code, then looking at this “translation”, in a sense, and attempting to reproduce the code solely from my knowledge of coding.

Monday, July 11, 2016

How Nanotechnology Works

           Nanotechnology has much to offer. To understand how it all works, you must first understand how small a nanometer really is. A nanometer is one billionth of a meter, which is smaller than the wavelength of visible light, or about one hundred-thousandth the width of a human hair. Nanotechnology deals with anything measuring between 1 and 100 nm. At the nanoscale, particles do not behave as you would expect; they move erratically. Although it is troublesome to pick and place atoms, once you succeed at that, you can build and produce almost anything with exact precision. On the nanoscale, everything that you think you know may or may not apply.
           Nanoparticles are so small that you cannot see them with a light microscope. Nanoscientists must use a scanning tunneling microscope or an atomic force microscope. Scanning tunneling microscopes use a weak electric current to probe the scanned material. Atomic force microscopes scan surfaces with an extremely fine tip. Fortunately, both microscopes send their data back to computers, which can analyze the information and display it on a monitor.
           Nanowires and carbon nanotubes have sparked particular interest among scientists. A nanowire is a wire with a diameter of about one nanometer. Electronic devices, like computer chips or cell phones, would improve with the help of nanowires. A carbon nanotube is a cylinder of carbon atoms, and its properties depend on the direction and way the atoms were rolled. With the right alignment of atoms, carbon nanotube material is hundreds of times stronger than steel and six times lighter. Engineers are trying to make building material out of it, especially for planes and cars, which would increase efficiency and strength.
          There are many products that benefit from nanotechnology. In older sunscreens, the particles were much larger, which is why the color was white. Newer sunscreens have nanoparticles of zinc oxide or titanium oxide, which leave no color behind. Scientists have also created clothing that protects from UV radiation and stains. They coat the outside of the fabric with zinc oxide nanoparticles, the same as sunscreen, for better UV protection. Other clothes have tiny nanoscale hairs that repel water and other materials.
           Nanotechnology is going to drastically change the world. Replicators like those in the world of "Star Trek" could become a very real thing through what is called molecular manufacturing.  The goal is for nano-machines to place millions of atoms together until a desired product is produced. Professor Richard Smalley explains that in order for molecular manufacturing to actually take place, there must be trillions of assemblers working together; with one assembler, it could take up to millions of years. Eric Drexler believes the assemblers could first replicate themselves, then exponentially reproduce to manufacture products. Manufacturing costs would decrease, leaving consumer goods cheaper and stronger. Nanotechnology might have its biggest impact on the medical industry. By working on the nanoscale, one can attack disease and reconstruct the body. There are speculations that nanorobots could enter the body, reversing the effects of aging and increasing the average life expectancy.
            Before any nanotechnology products are created, we must learn more about materials and their properties at the nanoscale. Elements behave differently at the nanoscale, so there is some concern about whether or not nanoparticles could be toxic. Doctors are skeptical because they are not sure if the particles could easily pass through the blood-brain barrier, a membrane that protects the brain from harmful chemicals in the bloodstream. Not only can nanotechnology be used for medical purposes, but if harnessed properly, it may allow us to create more powerful weapons. The most important thing we can do is carefully examine the challenges, risks, and ethical dilemmas.

All You Wanted to Know About Electron Microscopy...But Didn't Dare to Ask!

What Is Electron Microscopy?
One of the earliest and most powerful microscopes was a light microscope created by Antony van Leeuwenhoek. This microscope was made up of a powerful convex lens and an adjustable specimen holder. It's suspected that this microscope was able to magnify objects up to 400x, leading van Leeuwenhoek to discover protozoa, spermatozoa, and bacteria.

The first limitation of van Leeuwenhoek's microscope was the quality of the convex lens, but that could easily be fixed by adding another lens to magnify the image produced by the first. This configuration gives us the compound microscope, the basis of modern light microscopes. Today's light microscopes can give us a magnification of up to around 1000x, but they are limited by the wavelength of the light used for illumination. Small improvements in resolution could be reached by using a smaller wavelength of light (blue or ultraviolet) or by dipping the specimen and the front of the objective lens in oil (which has a high refractive index).

Then, in the 1920s, it was found that accelerated electrons in a vacuum behave just like light, except with a wavelength about 100,000 times smaller than that of light. This meant that they could be used for much higher magnifications. Instead of being manipulated by glass lenses and mirrors, electrons could be manipulated using electric and magnetic fields, giving rise to the first transmission electron microscope (TEM), built by Dr. Ernst Ruska at the University of Berlin.
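That factor can be sanity-checked with the de Broglie relation λ = h / √(2·mₑ·e·V) for an electron accelerated through V volts (the non-relativistic form, so only approximate at TEM voltages; the 100 kV figure below is my own illustrative choice, not from the article):

```python
import math

h   = 6.626e-34   # Planck constant, J*s
m_e = 9.109e-31   # electron mass, kg
e   = 1.602e-19   # elementary charge, C

def electron_wavelength_nm(volts):
    """Non-relativistic de Broglie wavelength of an electron
    accelerated through the given potential, in nanometers."""
    return h / math.sqrt(2 * m_e * e * volts) * 1e9

lam = electron_wavelength_nm(100_000)   # a typical 100 kV TEM
print(lam)                              # ~0.0039 nm
print(550 / lam)                        # ~140,000x shorter than green light
```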

The TEM has a resolving power of 0.1 nm, meaning that with the microscope we are able to tell the difference between two points down to 0.1 nm apart (points less than 0.1 nm apart will be seen as a single point). This is a dramatic improvement over the resolving power of the naked eye in proper light, which is 0.2 mm.

Saturday, July 9, 2016

Asymmetry in Crystallization Behavior

The article discusses the asymmetry between the critical heating and critical cooling rates of Vit 1, a glass-forming liquid. In this context, the word glass does not refer to the transparent material used in windows or cups; this glass is an amorphous solid that is not at equilibrium, as opposed to a crystal, a solid that is more ordered and at equilibrium.
It was observed that the heating rate required to suppress crystallization was significantly higher than the cooling rate required. The cooling rate was found to be 1 K/s while the heating rate was 200 K/s. This may be attributed to the fact that nuclei formed during cooling and heating are exposed to different growth rates, likely a general feature for metallic systems.
The significance of bypassing crystallization is that it allows the creation of a pure glass or supercooled liquid; crystallization disrupts the disorder of the glass's atomic arrangement. Crystallization is defined as the point at which the crystalline volume fraction within the melt reaches some small but finite value. It is identified by a rise in temperature, referred to as recalescence, caused by the release of the heat of fusion at the solid/liquid interface during crystallization, which leads to an increased heating rate.
In steady-state nucleation, the number of nuclei formed during the heating of a purely amorphous sample up to the liquidus temperature is the same as the number formed during cooling from the liquidus temperature down to the glass transition temperature. It was concluded that if any metallic liquid is quenched and forms an amorphous solid, it has to be heated at a much faster rate to combat crystallization.

Friday, July 8, 2016

Lab Internship: Week 1

During my first week at Kathryn Miller-Jensen's lab, I actually accomplished more than was expected.  I learned how the lab conducts experiments, how they grow their test cells, and how they make the microfluidic devices used in the experiments.  Victor Wong, one of the research students, taught me how to culture and grow Jurkat cells (an immortalized lineage of T cells) so the lab can test how these immune cells respond and communicate when introduced to HIV.  Victor explained that when introduced to HIV, cells respond and secrete differently due to biological noise, meaning that even if cells are genetically identical, they act differently because of differences in gene expression and protein levels.  Laura and Victor both showed me how to make the microfluidic devices by creating a PDMS solution and making a mold of the device using a master slide that holds the design.  The PDMS solution was created by mixing a base and a curing agent together and spinning the bubbles out of the mixture with a centrifuge so the bubbles would not affect the device.  If bubbles were still present, and they almost always were, we used a vacuum pump to draw out the bubbles and an air hose to pop them.  Victor and Laura baked the solution in an oven at 80˚C for 2 hours and cut out the devices with scalpels and blades.  After, they stored them in containers for further use.  

-M.K.

ELISA Assay

The ELISA, or enzyme-linked immunosorbent assay, is a method used to measure antibodies, antigens, proteins, and glycoproteins.  The ELISA has been used for pregnancy tests, diagnosis of infections such as HIV, and the measurement of cytokines, the proteins secreted by immune cells in response to pathogens.  The ELISA plate is coated with a capture antibody raised against the antigen being used in the experiment.  Next, the researcher introduces a sample of what they're testing, like a serum or blood sample, and the antigens in that sample attach to the capture antibodies.  Then, the researcher adds detection antibodies, which are labelled with enzymes.  After, the researcher adds a substrate that reacts with the enzymes and turns into a colored product.  From that color, the researchers can determine the antigen concentration in each sample.
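The final color-to-concentration step works by comparing each sample's absorbance against a standard curve built from wells of known concentration. A minimal sketch with made-up numbers (real analyses typically fit a four-parameter logistic curve rather than interpolating linearly):

```python
import numpy as np

# Hypothetical standards: absorbance readings at known antigen
# concentrations (pg/mL), increasing monotonically.
std_conc = np.array([0.0, 31.25, 62.5, 125.0, 250.0, 500.0])
std_abs  = np.array([0.05, 0.12, 0.22, 0.41, 0.78, 1.45])

def absorbance_to_conc(a):
    """Linearly interpolate a sample's concentration from its absorbance."""
    return float(np.interp(a, std_abs, std_conc))

print(absorbance_to_conc(0.41))   # 125.0 -- falls exactly on a standard
```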

-M.K.

Shape Identification and Particles Size Distribution From Basic Shape Parameter Using ImageJ

           The researchers at Mississippi University developed an ImageJ plug-in that extracts the dimensions of particles from a digital image after identifying their shapes, and then determines their particle size distribution. The paper describes the plug-in's development and its application to food grains and ground biomass. The plug-in was applied successfully to analyze the dimensions and size distributions of food grain and ground Miscanthus particle images.
          This research had three main objectives: developing the ImageJ plug-in, determining the effects of shape, size, and orientation, and demonstrating an application to samples. Plug-ins are useful because achieving the same output manually can be very time consuming and introduces user subjectivity. A computer vision method determined the distribution and amount of garlic, parsley, and vegetable ingredients in pasteurized cheese with an accuracy of over 88%, compared with the sensory method. Multiple methods remain in use because quick and accurate particle size distribution analysis is highly desirable, especially when dealing with granular or particulate materials. With ImageJ, one can map the actual particle to an equivalent ellipse and match its perimeter. The study establishes that fitted ellipse dimensions produce relatively good estimates.
           The results and discussion explained the multiple plug-in features that were tested. First, the authors determined that shape-based correction factors for fitted ellipse dimensions are essential for measuring linear dimensions. Fitted ellipse dimensions perform better than bounding rectangles because they produce less deviation. It was then noted that for triangular shapes, the FMR distinguishes them from other shapes for easy identification. These findings led to the development of the shape identification strategy. Later, an image of known geometric shapes and measurements was drawn and tested; however, some misclassifications occurred with samples. It was found that further research would need to be conducted to avoid misclassification.
           The effect of particle shape on the measured length and width was calculated using geometric reference particles through absolute deviation. It was concluded that shape does not have a significant effect on mean absolute deviation, and that particle area was inversely proportional to the mean absolute deviation. The results indicated good shape classification by the plug-in only with round particles. A coincidence of arithmetic and geometric mean lengths was also observed. The normal-distribution-shaped curve was skewed towards the left because the lengths were so small. A remaining drawback of the image processing method is that overlapping particles can only be handled with mechanical separation.
           The research conducted shows the great potential of ImageJ plug-ins, proving them to be efficient and reliable. They are ready to provide solutions for the specific needs of machine vision applications.
         
         

Thursday, July 7, 2016

Shape Identification and Particles Size Distribution From Basic Shape Parameters Using ImageJ

Abstract, Introduction, & Results and Discussion

In various fields, quick and accurate particle size distribution analysis is desired. Currently, many distribution analyses are created manually, which leaves room for inconsistencies and can be fairly time consuming. An ImageJ plug-in provides a way to use computer-vision-based image processing as an alternative or replacement method for measurement, identification, and size distribution analysis.

Though ImageJ comes with a built-in option for analyzing particles, this option presents multiple problems, including not providing dimensions of practical interest (i.e., the length and width of the particles), large deviations from the actual dimensions when using the dimensions of the particle's bounding rectangle, and deviations influenced by the particle shapes. Thus, it was concluded that fitted ellipse dimensions should be used as opposed to the rectangle dimensions. However, these dimensions still showed some deviation, such that they still need to be corrected for accuracy. The proposed research work deals with developing an ImageJ plug-in that analyzes particle size distribution by determining the particle shapes, applying correction factors, and then accurately giving the dimensions of the particles through corrected fitted ellipse dimensions.

The Results and Discussion section discusses several aspects of the ImageJ plug-in that were tested. One aspect is how using the major and minor axes of the fitted ellipse for dimension measurements gave more accurate results than using the bounding rectangle dimensions. Through several analyses with differently shaped particles at different orientations, it was clearly determined that the fitted ellipse was a better option than the bounding rectangle for dimension measurements. However, shape-based correction factors were still needed for measurement of linear dimensions.
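The advantage of the fitted ellipse can be seen in a quick numerical experiment (my own sketch using image second moments, the usual basis for ellipse fitting, not the paper's actual code): rotate a 20 x 6 pixel rectangle and compare the bounding box against the moment-based ellipse axes.

```python
import numpy as np

# Pixel coordinates of a 20 x 6 rectangle, rotated by 30 degrees --
# a stand-in for a binarized particle.
xs, ys = np.meshgrid(np.arange(20), np.arange(6))
pts = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)
theta = np.radians(30)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
pts = pts @ R.T

# Axis-aligned bounding rectangle: about 19.0 x 13.8 after rotation,
# so the 6-pixel width is badly overestimated.
bb = pts.max(axis=0) - pts.min(axis=0)

# Ellipse axes from the central second moments: about 23.1 and 6.8,
# proportional to the true 20 and 6 and independent of orientation --
# which is why fixed, shape-based correction factors can recover them.
centered = pts - pts.mean(axis=0)
cov = centered.T @ centered / len(centered)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
major, minor = 4 * np.sqrt(eigvals)

print(bb)
print(major, minor)
```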

Secondly, shape parameters of particles of known geometric shapes were observed in order to develop the shape identification strategy employed by the plug-in. These shape parameters included the reciprocal aspect ratio (RAR), rectangularity (RTY), and Feret major axis ratio (FMR). Each takes specific values for different shapes depending on their parameters, and these values were implemented in the plug-in to classify the shapes of particles.

The shape identification and dimension measurement accuracy of the plug-in was then tested using an image containing a group of shapes with known geometry and dimensions. This test showed high accuracy for the area calculations but significantly lower accuracy for the perimeter calculations. However, it was noted that the lower accuracy for the perimeters was not of high importance, since they can be determined in other ways and are not the primary focus of the plug-in. It was also observed that as shapes got smaller and smaller, past a certain point they could only be represented by a small square of pixels in the image. This would lead to some misclassifications with samples, as shapes were represented by fewer and fewer pixels. The conclusion was that further research would be needed to avoid misclassifications associated with (1) overlapping shape parameter values, (2) smaller shapes being represented by few pixels, (3) irregular shapes that deviate from the test cases, and (4) overlapping particles.

Next, the effects of particle shape, size, and orientation on the determined length and width were tested. The test for the effect of particle shape showed small mean deviations across all shapes. The test for the effect of particle area (size) showed that the deviations tended to decrease with an increase in particle area. Though the lengths of triangles and widths of ellipses showed increased deviation, this was not significant enough to drastically affect the determined length and width. Finally, the test for the effect of particle orientation showed only minor variations, with a gradually increasing trend with orientation angle.

Lastly, the plug-in was used to create a size distribution analysis with food grain and with ground biomass. Both tests proved the plug-in to be effective, and gave hope for its growing potential. ImageJ plug-ins have the power to provide tailor-made solutions to suit the specific needs of machine vision applications.

-Patricia Acorda

Wednesday, July 6, 2016

ImageJ

           ImageJ, created by Wayne Rasband of the Research Services Branch in Bethesda, Maryland, has many unique characteristics. The image processing program is readily available to all users, license free, and runs on any operating system. Throughout its six years of development and redesign, ImageJ has been downloaded tens of thousands of times, proving itself a reliable image processing program.
            The ImageJ program is nearly limitless because of the availability of extensions: macros and plug-ins. For example, a macro can be written that acquires an image every ten seconds and stores it in a sequence. Plug-ins are external programs that offer image processing capabilities that do not exist in the core of ImageJ. They have revolutionized ImageJ, bringing new ideas to the program.
          The imaging capabilities of ImageJ are known to be unlike any other. Not only can it preform basic operations and manipulations, but it also includes more sophisticated operations like dilation and erosion. One of its strong capabilities is that ImageJ can run on multiple platforms, which is called Cross-platform.  Due to the large developer community, there are always new and upcoming plug-ins and operations being developed. If a file format is not supported, the user community usually can develop support in a few days. Although there is no support hot-line, the large user base may communicate through a mailing list. It is easy to access and there are many users available online for aid; one can go through the thorough manual giving you step by step instructions.
          Just like everything else in the world, ImageJ does have its faults. Sadly, the sophistication of commercial image processing program's algorithms, lead them to be drastically faster. Also, the program does require having little knowledge for installation and first steps. There is not on-site installation or training unlike commercial image processors. Since ImageJ is constantly being redeveloped and designed, bugs and "undocumented features" can find its way in. The unwanted bugs can taint your on-going project.
          ImageJ has attracted countless scholars and students because of its accessibility and efficiency. The program proves that imaging is on the boundary between a field of science and a field of engineering.
         

Friday, July 1, 2016

Generalized Selection, Innate Immunity, and Viruses Conclusion

After conducting their research, the scientists discovered that culturing VSV in MDCK cells led to populations of generalists, meaning that the viral progeny can adapt to the host and evade the immune system of similar related hosts.  Generalists can adapt to different habitats and function accordingly in order to survive.  The specialist population, or the viral progeny from the HeLa-adapted populations of VSV, were believed to be less fit in unpredictable environments than the generalist populations since they did not experience selection and did not adapt to the innate immunity of the host.

One of the factors of host breadth, or the ability to adapt to hosts, is cell-surface proteins.  Proteins on the cell surface dictate which viruses can attach to the cell and which cannot.  The viruses that can attach are the ones that can carry out viral replication.  VSV, the virus used in the experiment, is not limited by the proteins on the cell surface, meaning that it will have no problem entering the host.  This means that the surface proteins will not affect the fitness of the virus; only the immune system can.

The results of the experiment showed the scientists that the evolution and selection of a virus are key factors in determining the fitness of a viral population.  Understanding how the environment affects a virus can show how a virus can maintain beneficial host-antagonistic alleles, or alleles that code for a virus's virulence, including the ability to evade the innate immunity, or immune system, and infect the host.  A competent IFN environment--an immunocompetent host with active interferons that activate the immune system and protect the host cells--influences VSV to maintain the beneficial alleles to survive and successfully replicate in the host.  However, a relaxed IFN environment, or an immunodeficient host, leads to the loss of the host-antagonistic alleles.
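The contrast between the two IFN environments can be sketched with a toy selection-plus-decay model (my own illustration, not the study's methods): when immune pressure gives the host-antagonistic allele a fitness benefit, the allele is maintained near fixation; with the benefit removed, it simply decays out of the population.

```python
def evolve(p0, benefit, loss_rate, generations):
    """Track the frequency of a host-antagonistic allele.

    benefit:   relative fitness advantage under immune selection (made-up value)
    loss_rate: per-generation rate at which the allele mutates away (made-up)
    """
    p = p0
    for _ in range(generations):
        w_bar = p * (1 + benefit) + (1 - p)  # mean population fitness
        p = p * (1 + benefit) / w_bar        # selection step
        p *= 1 - loss_rate                   # decay step
    return p

# Competent IFN environment: evasion pays off, so the allele is maintained.
competent = evolve(0.9, benefit=0.5, loss_rate=0.01, generations=200)
# Relaxed IFN environment: no benefit, so the allele drifts away.
relaxed = evolve(0.9, benefit=0.0, loss_rate=0.01, generations=200)
print(f"competent: {competent:.2f}, relaxed: {relaxed:.2f}")
```

The allele frequency stays high in the selective environment and collapses in the relaxed one, mirroring the maintenance-versus-loss pattern the scientists observed.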

The viral progeny from the alternate-host populations lived half of their evolutionary lives in the MDCK host.  However, the viral progeny did not efficiently adapt and maintain the alleles needed to evade the innate immunity because the VSV was not under the selective pressure long enough.  The VSV was cultured in differing environments, leading scientists to believe that the adaptation to multiple environments influenced the maintenance and expression of beneficial alleles, and ultimately generalization.  Despite this, the viral progeny did not have high fitness and did not survive as well as hoped in the immunocompetent novel hosts (PC-3 cells).  The novel hosts were used as test sites to determine the fitness of the VSV.  The researchers believed that the maintenance of the host-antagonistic alleles led to the loss of alleles that would allow for the growth of the VSV in 3 of the 4 alternate-host populations.

The scientists also concluded that only the innate immunity affects viral evolution, not other properties of the host.  They tested this with VERO cells, or primate kidney cells.  The generalist viral populations grown in MDCK cells, or canine kidney cells, also had high fitness in the VERO cells, since the MDCK-evolved populations were able to evade the innate immunity of the VERO cells due to robust viral selection.

With their research, the scientists ultimately concluded that the innate immunity of a host impacts the evolution of a virus.

-M.K.

Integrated Circuits-Memory Grows Up

The groundwork for most mobile phones includes non-volatile memories (NVMs). These NVMs have contributed to the decreasing cost and power consumption of integrated circuits, allowing the cost of 1 gigabyte (GB) to approach $1. NVMs incorporate flash memory, which adds a gate to the existing MOSFET transistor geometry in order to control the containment and release of an electric charge, whose presence or absence is the bit of information stored by the NVM. Each NVM consists of a storage element and a selection device that addresses it in a memory array.
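The charge-as-bit idea can be sketched with a toy cell model (my own simplification; the voltage values are made up, not from any datasheet): charge trapped on the extra gate raises the transistor's threshold voltage, and a read compares that threshold against a reference to recover the stored bit.

```python
# Toy model of a floating-gate flash cell. The constants are illustrative
# placeholders, not real device parameters.
class FloatingGateCell:
    BASE_VT = 1.0       # threshold voltage with no stored charge (volts)
    CHARGE_SHIFT = 2.0  # Vt shift caused by trapped charge
    READ_REF = 2.0      # reference voltage applied during a read

    def __init__(self):
        self.charged = False  # cells start in the erased state

    def program(self):
        """Inject charge onto the floating gate (stores a 0)."""
        self.charged = True

    def erase(self):
        """Remove the trapped charge (back to the erased state, a 1)."""
        self.charged = False

    def read(self):
        """Compare the effective threshold against the read reference."""
        vt = self.BASE_VT + (self.CHARGE_SHIFT if self.charged else 0.0)
        return 0 if vt > self.READ_REF else 1

cell = FloatingGateCell()
print(cell.read())  # erased cell
cell.program()
print(cell.read())  # programmed cell
```

By the usual flash convention, an erased cell reads 1 and a programmed (charged) cell reads 0; the selection device mentioned above is what picks this one cell out of the array so it can be read or written.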
In December, many researchers gathered to discuss the issue of maximizing scalability. Companies like Samsung, Intel, and Toshiba proposed a variety of 3D structures that all stack several memory layers to increase the memory density per unit area of silicon and thereby decrease the cost per bit. The challenge with 3D memory technologies using thin-film transistors is guaranteeing effective performance and reliability.
There has also been a shift to a two-terminal device, rather than the standard three-terminal transistor, to maximize storage. Researchers from Intel introduced the use of chalcogenide to address phase-change memory storage elements by exploiting the ovonic threshold switching effect. The chalcogenide alloys can replace transistors in accessing storage elements inside a memory array. In addition, other alternative non-volatile storage elements that do not use a floating gate have been developed, including resistive random access memory based on metal oxides and charge-trap memories such as SONOS.
Al Fazio of Intel warned that increasingly complex algorithms will need to be developed to make sure flash memory behaves as anticipated in its most popular form, NAND. He also recognized that 3D cross-point memories could deliver the high speed and low cost needed to positively shape memory capacity in storage applications.

Image Processing with ImageJ

Recently, the discipline of imaging has come to play an important role in multiple research areas, leading many of the professionals in these areas to use image processing software. One increasingly popular - though quite old - image processing program is ImageJ. ImageJ is well known for being public domain software that can run on any operating system. In addition, ImageJ appeals to users with its image manipulation capabilities as well as its reliable and immense user community, which creates a forever expanding web of ideas and innovation.

Being public domain software means that ImageJ is free, and anyone can download and use it license-free. This automatically makes it a popular choice among image processing software; at the time the article was written, it was being downloaded about 24,000 times each month. The fact that ImageJ can run on any operating system is also quite a feat, making it available to anyone who wishes to use it.

Unlike other image processing software, ImageJ has a vast amount of imaging capabilities, which are continually expanding. These include support for all common image manipulations, basic and more sophisticated operations, mathematical operations, visualization operations, and surface and volume rendering. In addition to the core capabilities of the software, user-made extensions are also available online. These extensions include macros, which make it easier to automate repeated tasks, and plug-ins, which are external programs that offer additional image processing capabilities. These programs have made ImageJ extremely helpful to many researchers, especially in the science and engineering fields.

One aspect of ImageJ that really keeps it relevant to modern needs and well-updated is the fact that it has an immense user community that offers extensions to the software as well as guidance. Because ImageJ is public domain software, there is no hotline for problems, leading users across the world to open up about different solutions and ask each other questions. This has really helped ImageJ - despite being a six-year-old software - continually evolve as time moves on and keep its functions updated with modern technology. As such, ImageJ is still used today in many research facilities, serving both the engineering and science fields well.

-Patricia Acorda


Thursday, June 23, 2016

Skills and Knowledge I Want to Acquire

Since I will be working in a scientific research laboratory, I would like to learn more about biology.  To be specific, I want to learn about biological noise and genetic diversity and how proteins affect gene expression.  I also want to know how research is conducted in a wet lab.  Through my internship, I also want to learn how to develop scientific presentations and how to present to a scientific audience.  I also want to be able to present my research via presentation board and make it attractive and aesthetically pleasing for the audience.

Personally, I want to fix my procrastination issue and do all of my work on time.  Procrastination has been a longtime issue for me, and if I do continue this behavior, it will only make things harder for me in the future.  I also want to improve my motivation and dedication to science through my internship.  I want to be more invested in science since I do want to pursue a career in this field.

-M.K.

Wednesday, June 22, 2016

Skills to Acquire

I would like to learn how to conduct myself in a lab. Having taken high school science classes before, I know the basic safety protocol. However, I have not worked in a research lab, and I hope to learn the appropriate measures to use. Not only that, but I would like to learn how to behave in a professional setting. It will help me acquire a taste for the working environment, which will benefit me in the future. I hope to learn how to give scientific research presentations to a group of people. This skill is vital if I decide to go into research or science in the future, both of which I am interested in. I will work to be able to take these skills away from this internship.
 A personal skill I would like to improve is managing time. I would also like to improve my memory so that I can remember to schedule my obligations and keep track of everything. Since I have a lot on my plate senior year, I want to be able to contribute the most I can to everything I do without spreading myself thin. In order to do this, I have to be able to allot time to everything through scheduling. This life skill will enable me to become a more organized, productive person.

Tuesday, June 21, 2016

Skills I Would Like to Learn

This summer, my big focus is my social and public speaking skills. This is both a professional and personal goal for me as it is very important and beneficial to have good relations in work environments and a calm demeanor while presenting research. (I tend to be on the panicking side.) As such, I would really like to learn how to fight through my stage fright and also overcome my shyness around new crowds. This involves getting used to putting myself out there and taking that "leap of faith" to speak out more often so it won't be so daunting eventually. It also includes having more confidence in myself and what I'm capable of. I should establish more confidence in my own research so I can present it smoothly as well as confidence in my already obtained skills so that I can comfortably do work and share my ideas.

My second focus is time management with long-term projects. Often, I have trouble spacing out my work properly so I can tackle huge projects in smaller sections. I find it easier to blast through assignments in one shot, so I don't have to worry about it later. But with assignments that are meant to take a long time, it's harder to do it all in one night. I need to learn how to effectively divide the work into "bite"able sections so I can conquer each section one by one. This includes better understanding how I work best as an individual so I can gauge what size sections I can best handle as well as organizational skills in working these sections into my schedule and planning them out so I can complete them in the designated times. This also includes a general willpower to actually get those sections done in the designated times without procrastination.

In conclusion, I particularly struggle with social skills, public speaking, and time management concerning long-term projects. I hope to practice these skills this summer so I can grow accustomed to using them and become comfortable enough to proudly showcase them as skills I have acquired.

-Patricia Acorda

Friday, June 17, 2016

A beginner's guide to nanotechnology

Nanotechnology is becoming a more promising science with great potential in manufacturing. The existing manufacturing process produces a lot of waste and pollution. Many atoms are wasted in the process of creating a larger object and getting the desired output by taking away excess material. As a result, the products made to fit together are billions of atoms out of alignment. This misalignment causes faster wear, expensive lubrication, and mechanical breakdown. Since molecular nanotechnology involves the manipulation of incredibly small particles, it will allow manufacturing to become more precise and therefore cleaner and cheaper.

Currently, nanotechnology has gotten as far as scanning tunneling microscopy, which uses a sharp tip to move atoms around a surface. There is also research being done on how to make the production of single-walled carbon nanotubes cheaper. This is important because these tubes can save the millions of dollars being used to mine materials such as steel or aluminum. They are lighter and stronger than steel, and can be produced in a lab. Scientists hope to develop a process where a nanoscale robotic arm with a "sticky" tip would multiply to create millions of copies of itself. These arms would be able to place individual atoms at a much faster collective rate than placing the atoms in the desired arrangement one at a time.

Possible applications for nanotechnology include, as mentioned, the improved efficiency of the manufacturing process. It also has the potential to revolutionize medicine. At this scale, small devices can be created to enter the human bloodstream in order to repair damage at the cellular level. This specificity of treatment can target the problem directly, while something like chemotherapy weakens the entire body.

Generalized Selection, Innate Immunity, and Viruses

Viruses are biological agents that hurt and can possibly kill living organisms.  By taking over the cells of the host, viruses can replicate and produce viral progeny.  In order for viruses to replicate, they must be able to survive the defense mechanisms of the host, or the innate immunity.  Examples of innate immunity are the immune system, the skin, and mucus.  In the experiment conducted, the defense mechanism studied was the immune system.  Viruses must be able to evade the immune system in order to reach the host cell and successfully replicate; otherwise, they would get destroyed by the immune system before reproducing in the host cells.  In an immunocompetent host, interferons are produced, identifying the pathogen and triggering the immune response.  However, some cells are immunodeficient, meaning that they do not produce interferons.  Viral selection occurs in immunocompetent hosts since the immune system destroys the viruses that cannot evade it.  This selection weeds out the viruses that cannot evade the immune system, so only the part of the virus population that can evade it reproduces.  In immunodeficient cells, there is no such selection.  The scientists experimented to find the connection between the virus populations that replicate and their viral progeny, and to see if the innate immunity, or the immune system, affects how a virus evolves.


The scientists cultured populations of the vesicular stomatitis virus in immunodeficient and immunocompetent hosts.  HeLa cells (human carcinoma cells) were the immunodeficient hosts, and MDCK cells (canine kidney cells) were the immunocompetent hosts.  The scientists hypothesized that the virus that adapts and replicates in the HeLa cells would only be able to adapt and survive in immunodeficient cells and related hosts, due to the fact that there was no immunity-selective pressure present.  The researchers also hypothesized that the virus that adapted to the innate immunity of the MDCK cells would have high fitness not only in immunocompetent hosts but also in immunodeficient hosts, since no selection occurs there.  The researchers also tested a mixed viral population, exposing some to HeLa cells and a separate population to MDCK cells, predicting that they would also have high fitness, or the ability to evade the immune system and replicate.  To test their hypotheses, the scientists placed the viral progeny from each cell host into another set of immunodeficient and immunocompetent cells.  LNCaP cells were the immunodeficient cells while PC-3 cells were immunocompetent.

After conducting research, the scientists discovered that the viral progeny from the virus populations cultured in HeLa cells could only survive in immunodeficient cells, like their predecessors.  However, the viral progeny from the MDCK population only had moderate fitness in immunocompetent cells.  The viral progeny from both cell types had different levels of fitness because each virion had different capabilities of evading the immune system, since some of the population was exposed to immunity-selective pressure while the rest was not.  The scientists concluded that innate immune selection does affect the evolution of RNA viruses.

-M.K.

Thursday, June 16, 2016

A Beginners Guide to Nanotechnology

            Nanotechnology promises to be the next technological revolution. It will be cleaner and cheaper, with more flexible manufacturing capabilities than anything in the world today. Nanotechnology means many things to many different people. It not only provides assistance to medical science, but also to engineers and manufacturers. Nanotechnology is nothing new. If we look at the world around us, nature already works at the most basic level of measurement, the nanoscale. Life as we know it is made up of tiny cells that dictate our every move.
           Today scientists are studying molecular nanotechnology to manipulate specific atoms and use molecules to build things. Using a scanning tunneling microscope (STM), which uses a sharp tip to push atoms around, scientists are gaining the ability to pick and place individual atoms and molecules to create things with ease. Nanotechnology will forever change the way factories are able to build things, economically and cleanly, with little or no waste. It will produce not only cleaner products, but much smaller ones as well. Medical science will be able to create devices small enough to enter the body's bloodstream to repair damage and treat diseases at the cellular level.

Monday, June 13, 2016

A Beginner's Guide to Nanotechnology

Currently, many scientists are working with nanotechnology, believing that it is the future of technology. Rocky Angelucci's A Beginner's Guide to Nanotechnology gives readers a view of what nanotechnology really means for the future and the possibilities associated with this new technology. The article specifically focuses on "molecular nanotechnology," a specific branch of nanotechnology that deals with manipulating individual atoms and molecules to eventually build amazingly precise mechanisms.

So what exactly is the hype with being so precise, right down to individual atoms? Angelucci compares modern manufacturing techniques with molecular nanotechnology, emphasizing how - on a molecular level - modern machinery doesn't make pieces that fit well together at all, thus giving way to faster disintegration and costly lubrication. Molecular nanotechnology aims to change that by giving us the power to move individual atoms and molecules. Since atoms are the building blocks of everything around us, acquiring this technology would open almost limitless opportunities. With current scanning tunneling microscopy (STM), scientists are already able to push atoms around using a fine tip. In the future, we may see technology that lets us move large amounts of atoms at the push of a button. Going further, this technology could be used in a factory environment, where millions of mini nanoscale assembly devices will allow engineers to create things to a precision not possible with the naked eye and current technology.

This technology will be able to not only build smaller mechanisms that can enter tiny spaces, but it will also allow us to push space efficiency, making it possible to fit complex sequences into a small area, similar to a living cell. (Though reaching that complexity might take us a few years...or more.) Materials will become stronger and generally cheaper and engineering design will take on a fun size; so welcome to nanotechnology.

-Patricia Acorda

Thursday, June 9, 2016

Internship and Personal Goals

             One of my goals for this internship is to decide if I am interested in pursuing an engineering career. I'm aware engineering is a broad field, and the lab I will be working in does not define the entire career path. However, it would help me narrow down my interests in order to give me the head start that will help me in college. I will achieve this goal by being on top of everything from the beginning of my internship. I will ask questions as they arise, ensuring that I am retaining and understanding the information I am given. I will always be attentive and invested in the work that I do within the internship. Another goal I have is to build connections that can benefit me in the future. The internship in itself is an incredible opportunity, but it can also blossom into other opportunities through the connections I build. I can do this by making a good impression on my PI and the other adults working in the lab. I will be early and give my fullest effort on every task I am given. I will only demonstrate my best work through the internship, and maintain a good work ethic.
             A personal goal I have for myself is to cut the time it takes me to run three miles down by five minutes. I joined the cross country team last year, and I was not prepared for the season. This year, I will condition in the summer with the rest of the team in order to be properly prepared for the upcoming season and improve my times. The cross country meets are usually around three miles long, so if I could cut down my time it would greatly improve my competition performance.