Category: Blog


Coffee from a café or a commercial espresso machine often tastes much better than coffee prepared at home using traditional methods. There can be many reasons underlying this difference in taste. To help readers understand how home-brewed coffee differs from coffee made with commercial coffee makers, we have outlined some of those reasons below.


Tools can be blamed

The best single serve coffee makers on the market are equipped with precise tools that make them ideal for brewing. Traditional home brewing, by contrast, lacks these tools, which is why home coffee tastes a bit different from coffee made with a dedicated machine.

Grinding

The grinders built into commercial coffee makers are much larger and grind beans efficiently and consistently, producing a well-extracted cup with the intended taste and flavor. Coffee at home, on the other hand, is often prepared on the stove without any grinder at all, so the desired taste and flavor are hard to achieve.

Professional Coffee Maker

The people who operate coffee machines in commercial settings are trained and highly proficient at preparing coffee. Coffee at home, on the contrary, is usually prepared by amateurs who have little knowledge of the craft. This alone creates a real difference in taste, which is why coffee from commercial machines is so delicious to drink.

Essential Ingredient: Water

The most overlooked ingredient in coffee is water. Few of us think about it while preparing coffee at home, yet it plays an essential role. The amount of total dissolved solids (TDS) in the water affects the taste, and commercial coffee operations take care to control it. Professionals pay far more attention to water TDS than individuals preparing coffee at home.

Don’t Forget Fresh Coffee Is the Best

It is highly advisable to use fresh coffee beans, no more than two weeks past their roast date. At home, it is difficult to keep fresh coffee beans on hand all the time. Commercial coffee operations, however, roast their beans shortly before brewing, ensuring customers are served some of the best coffee available.

Use of Best Machines

Traditional brewing involves a stove and a pan, which yields a simple, plain coffee made to an individual's own recipe. Coffee from a commercial coffee maker, on the other hand, is prepared with purpose-built tools and equipment suited to the job. Each machine follows a predefined recipe tuned for taste, flavor, and drink type. That is not all: the best single serve coffee makers also let users change the machine settings to prepare coffee to their own expectations.

Final Words

While traditional brewing and commercial coffee makers differ in many ways, it is always worth trying new recipes at home and coming up with great coffee of your own. Die-hard coffee lovers, however, may prefer to choose one of the best single serve coffee makers and enjoy the taste of coffee day and night. To pick the best one, compare the different machines on features, tools, and build, and then choose the most suitable option.



When we run a dispersion model, we set up more than 500 receptor points (mostly populated cities across the globe) to record simulated radionuclide concentrations at specific time intervals.  This datafile contains the resulting I-131 and Cs-137 values, at three-hour intervals, at these receptor points.  Please see our study for more information on this dispersion simulation.

partMOM: DataPoke Models and Data – Chino (2011)

Source: Datapoke atmospheric dispersion simulation.

Download: We’re working to integrate our datasets into Google Fusion Tables and/or Pachube.  If you’d like to help please contact us.

Wiki: datafile: dp_chino

Commentary:  Check out the study of this data set!
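
If you want to poke at the datafile directly, here is a minimal Python sketch of one way to read it. The file name and column names below are assumptions for illustration only; check the wiki page above for the actual layout.

    import csv
    from collections import defaultdict

    # One row per receptor per 3-hour interval; column names are hypothetical.
    series = defaultdict(list)  # receptor -> [(datetime, I-131, Cs-137), ...]
    with open("dp_chino_receptors.csv", newline="") as f:
        for row in csv.DictReader(f):
            series[row["receptor"]].append((
                row["datetime_utc"],
                float(row["i131_bq_m3"]),
                float(row["cs137_bq_m3"]),
            ))

    # Example: the interval with the highest I-131 concentration at one receptor.
    peak = max(series["Tokyo"], key=lambda r: r[1])
    print("Peak I-131 at Tokyo:", peak)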


A recently disclosed Tepco document indicates total emission estimates for both plutonium and neptunium-239 during the first 100 hours of the catastrophe.  This leaked Tepco document [19] suggests a release of 1.2 trillion Bq of Pu-238, Pu-239, Pu-240, and Pu-241 collectively, and 76 trillion Bq of Np-239, within the first 100 hours. Our goals for this study included developing atmospheric dispersion plots of these emissions and modeling radionuclide concentrations at receptors worldwide.  We then publish these results to the partMOM application for public analysis.

Method

In order to develop a time frame for the emissions, we utilized the temporal framework established by Chino M., et al. [7]. Specifically, we took the first 100 hours of the catastrophe to span March 12, 2011, 01:00 through March 16, 2011, 05:00, adjusted to UTC.

Similarly, we implemented release ratios identical to those of I-131 as established by Chino (2011).  That is to say, for the period 3/12 01:00 – 3/14 14:00 (61 hours total) we assumed a release of 2% of the total emissions of both Pu-239 and Np-239. The following table outlines the release ratios for the full 100-hour emission interval:

After the initial 100 hours, we assumed emissions of Pu-239 and Np-239 ceased.  We acknowledge this may not represent an accurate emissions profile, as the isotopes were likely emitted continuously even after the initial 100-hour interval.
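
To make the release-ratio arithmetic concrete, here is a minimal sketch using only the figures quoted above (the leaked totals and the 2%/61-hour interval); the remaining rows of the ratio table would be handled the same way.

    # Convert a total-emission estimate plus a per-interval release ratio
    # into an hourly release rate (Bq/hr).
    TOTAL_NP239_BQ = 76e12  # leaked Tepco estimate, first 100 hours
    TOTAL_PU_BQ = 1.2e12    # Pu-238/239/240/241 collectively

    def hourly_rate(total_bq, ratio, hours):
        """Bq/hr for an interval releasing `ratio` of the total over `hours`."""
        return total_bq * ratio / hours

    # 3/12 01:00 - 3/14 14:00 UTC: 61 hours carrying 2% of the total.
    print(hourly_rate(TOTAL_NP239_BQ, 0.02, 61))  # ~2.5e10 Bq/hr of Np-239
    print(hourly_rate(TOTAL_PU_BQ, 0.02, 61))     # ~3.9e8 Bq/hr of Pu (total)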

We utilized the open-source software FLEXPART, in its standard configuration, for all dispersion models. Our transport bounding box extended from pole to pole, essentially covering both the Northern and Southern hemispheres. Simulations utilized 0.5-degree GFS weather data and a total particle population of 5 million.  Convection was not accounted for (lconv=0).


We added species definitions to FLEXPART for both Pu-239 and Np-239. We assumed both isotopes were completely volatilized, with properties similar to volatilized Cs-137. A link to both species definitions follows:

Neptunium-239 quickly decays (half-life 2.3 days) into plutonium-239.  FLEXPART does not account for beta decay, and it does not suffice to simply drop the neptunium particles from the model after 2.3 days (via a half-life parameter or an age-spectra definition), because the decay product, Pu-239, is itself of great interest. Instead, we omit the half-life of Np-239 from its species definition and use the resulting Np-239 concentrations to carry out decay-chain calculations afterwards.
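
As an illustration of that decay-chain step, here is a minimal sketch. It assumes the 2.3-day Np-239 half-life quoted above and the standard 24,110-year half-life of Pu-239; the function is ours for illustration, not part of FLEXPART.

    import math

    LAMBDA_NP = math.log(2) / (2.3 * 86400)             # Np-239 decay constant, 1/s
    LAMBDA_PU = math.log(2) / (24110 * 365.25 * 86400)  # Pu-239 decay constant, 1/s

    def pu239_activity(np239_bq, elapsed_s):
        """Pu-239 activity (Bq) grown in from an initial Np-239 activity.

        Pu-239 decay is negligible on simulation timescales, so every
        Np-239 atom that has decayed is counted as a surviving Pu-239 atom.
        """
        np_atoms = np239_bq / LAMBDA_NP
        pu_atoms = np_atoms * (1.0 - math.exp(-LAMBDA_NP * elapsed_s))
        return pu_atoms * LAMBDA_PU

    # 1 Bq/m3 of modeled Np-239, ten days on (essentially fully decayed):
    print(pu239_activity(1.0, 10 * 86400))  # ~2.5e-7 Bq/m3 of Pu-239

The tiny result reflects Pu-239's long half-life: a small Pu-239 activity can still correspond to a large number of plutonium atoms deposited at a receptor.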

Results

In order to determine projected radionuclide concentrations within the FLEXPART model, we defined more than 500 receptor points worldwide.  No model validation has taken place, owing to the absence of any publicly disclosed plutonium or neptunium measurements that we are aware of.

Additionally, the ~2000 pFLEXPART plot maps provide an extensive visual record of the contamination's dispersion.

In contrast to most original scientific studies, the aim of this project is not to publish the consensus of a handful of scientists.  Rather, we aim to present the technical data to a worldwide community in order to encourage open commentary.

Please have a look at the published data and plot maps:

partMOM Application: Modeling Pu-239 dispersion with FLEXPART

Sources Of Error

Translation. We depended entirely on outside sources for translation of the leaked Tepco document.  There remains some possibility that it has been interpreted incorrectly.

Release Rates. We made several assumptions about the release rates of both isotopes.  These rates represent our best educated guess; as no physically measured release rates currently exist, our estimates may contain errors.

Physical Measurements. No validation of FLEXPART dispersion maps took place due to the absence of any published physical measurements of the isotopes modeled.

Conclusion

It remains difficult to determine the prevalence (or lack thereof) of Pu-239 or Np-239, as Japanese and American officials have disclosed few, if any, measurements of these isotopes.…


Our goal was to develop contamination dispersion plots of the radionuclides emitted from the Fukushima Daiichi Nuclear Power Plant, after which we compare the simulation results to physical measurements taken at CTBTO monitoring stations worldwide. Finally, we publish all data to the partMOM application for public analysis.

Method

We took initial I-131 source-term estimates from Chino M., et al. [7], and developed Cs-137 release rates from the I-131/Cs-137 ratios published by Chino [7]. All datetimes in the Chino publication were converted to UTC for comparison with other data sets within the partMOM application. After April 5, 2011, a release rate of 1 billion Bq/hr (for both I-131 and Cs-137) was used for the remainder of the simulation. We derived this rate from the I-131/Cs-137 ratios for the beginning of April (Chino [7]) and the published release rates of Cs-137 through June (TEPCO [11]).
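
A minimal sketch of that scaling, with placeholder numbers rather than the interval-by-interval figures from Chino [7]:

    # Cs-137 release rate implied by an I-131 rate and an I-131/Cs-137 ratio.
    def cs137_rate(i131_rate_bq_hr, i131_cs137_ratio):
        return i131_rate_bq_hr / i131_cs137_ratio

    # Hypothetical interval: I-131 at 1e14 Bq/hr with a ratio of 10.
    print(cs137_rate(1e14, 10.0))  # -> 1e13 Bq/hr of Cs-137

    # After April 5, 2011 both rates are held at 1e9 Bq/hr (1 billion Bq/hr).
    I131_TAIL_BQ_HR = CS137_TAIL_BQ_HR = 1e9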

We utilized the open-source software FLEXPART, in its standard configuration, for all dispersion models. Our transport bounding box extended from pole to pole, essentially covering both the Northern and Southern hemispheres. Simulations utilized 0.5-degree GFS weather data and a total particle population of 5 million.  Convection was not accounted for (lconv=0). FLEXPART accounts for the I-131 half-life in its species definition file. Standard FLEXPART installations, by contrast, omit the half-life of Cs-137 – a reasonable simplification given its half-life of 30+ years.  Accordingly, we did not activate the agespectra feature.
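
As a quick check on ignoring Cs-137 decay, the fraction of Cs-137 surviving a roughly four-month simulation window is:

    import math

    # Surviving fraction after time t: exp(-ln2 * t / T_half).
    t_days = 120                 # ~4-month simulation window
    t_half_days = 30.1 * 365.25  # Cs-137 half-life, ~30 years
    print(math.exp(-math.log(2) * t_days / t_half_days))  # ~0.992: <1% decays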

Results

In order to determine projected radionuclide concentrations within the FLEXPART model, we defined more than 400 receptor points worldwide.  Our FLEXPART receptors correspond to locations for which we hold physical measurements (dose rate or concentration). This allows a comprehensive analysis and validation of the FLEXPART model output.

Additionally, the ~8000 pFLEXPART plot maps provide an extensive visual record of the contamination's dispersion.

In contrast to most original scientific studies, the aim of this project is not to publish the consensus of a handful of scientists.  Rather, we aim to present the technical data to a worldwide community in order to encourage open commentary.

Please have a look at the published data and plot maps:

partMOM Application: FLEXPART model utilizing Chino (2011) source terms.

We maintain a separate partMOM application where we post our ongoing analysis of this study. We encourage you to open a new partMOM application and begin an open analysis of your own.

Here are a few of the initial findings:

Our initial review of the FLEXPART model output included comparison with published CTBTO concentration measurements.  In most cases, the FLEXPART output showed lower concentration levels than those published by the CTBTO.

At Takasaki, JP, the modeled levels of Cs-137 and I-131 were one order of magnitude smaller than the CTBTO-measured concentrations.

The CTBTO publishes concentration measurements for Takasaki, JP, at times when no concentrations at all are projected by the FLEXPART model output.

March 18, 2011, Sacramento, CA: FLEXPART output indicates I-131 concentrations two orders of magnitude lower than those published by the CTBTO, while the FLEXPART Cs-137 levels are consistent with the CTBTO's.
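
The comparisons above reduce to an order-of-magnitude gap between measured and modeled values; here is a minimal sketch of that calculation, with illustrative numbers rather than CTBTO figures:

    import math

    def magnitude_gap(measured_bq_m3, modeled_bq_m3):
        """Positive: the model under-predicts by that many orders of magnitude."""
        return math.log10(measured_bq_m3 / modeled_bq_m3)

    print(magnitude_gap(5.0e-3, 5.0e-4))  # -> 1.0 (model one order low)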

Upper altitudes (more than 32 meters above ground level) showed radionuclide concentrations several orders of magnitude higher than near-surface concentrations.  This indicates that physical measurements taken at ground-level monitoring stations may represent only a fraction of the total radionuclides dispersed over a location.  A more accurate assessment of radionuclide dispersion requires measurements at several altitude levels; specifically, disclosure of helicopter or deployable monitor measurements would provide a much more detailed account of the contamination's dispersion.

Sources of Error

Model Input Parameters. Small changes to input parameters can profoundly affect the FLEXPART model output.  For example, increasing the lsync value six-fold increased concentration readings at long-range receptor locations more than four-fold.  We used standard FLEXPART parameters (see the COMMAND file above), but the importance of input parameters should not be overlooked.

Model Errors.  FLEXPART uses probabilistic algorithms to predict particle transport.  It is possible the FLEXPART model contains inaccuracies which would, in turn, produce errant concentration estimates.

Source Terms. The exact emissions from FNPP are not known.  Chino (2011) uses deductive algorithms to evaluate source terms – but the fact remains that no direct measurements of the emissions at the FNPP reactors were taken (or, if they were, they have not been publicly released). Any statistical or algorithmic errors in that evaluation could lead to errant source terms.  Furthermore, Chino (2011) does not consider the possibility of emissions from the spent fuel pools at reactors 2, 3, and 4 – a possibility that remains widely debated.  It should be pointed out that a “core melt on fresh air” at spent fuel pool 4 was specifically mentioned in an Areva report to the Japanese government.

Measurement Inaccuracies. It is possible that the physical measurements taken at monitoring stations contain errors, or that valid measurements have been withheld.

Weather Data.  We utilize GFS data at 0.5-degree resolution.  This resolution is sufficient for long-range and mesoscale dispersion forecasts; however, local weather data is preferable for short-range transport and deposition modeling.

Conclusion

FLEXPART output consistently showed concentrations one to two orders of magnitude lower than those published by the CTBTO.  We cannot explain this conflict between modeled and measured data through model error alone; we consider errors in the source-term evaluation one possible source.  Our ongoing research includes utilizing source terms 2–3x greater than those published by Chino (2011).

Particle dispersion and deposition models such as FLEXPART do not exactly reproduce physical events.  That is to say, the model provides a statistical estimation of the movements of particles in time and space, which may differ from their actual physical movements.  If we had exact physical measurements of these particles throughout their lifetimes and travels, we would not need a dispersion model.

Models that use statistical algorithms to approximate the properties of a population are inherently open to error. It is important to note that statistical models accurately reflect reality only if both the underlying algorithms and the parameters supplied to the model are correct.

Read – Study: Modeling Fukushima NPP Pu-239 and Np-239 Atmospheric Dispersion

The FLEXPART model (and others like it) provides a framework with which we can both develop ideas about the travel and dispersion of radionuclides and forecast which populations and locations face potential exposure to high levels of contamination. As we observed with the model of the Namie evacuation, FLEXPART revealed locations along the existing evacuation route subjected to high levels of contamination – which leads to very helpful inferences such as: “Evacuating [Namie town] to a location 15 km northwest may not be such a great idea – a better idea may be to visit your aunt, in Niigata prefecture, on the west side of the island”.

We contend that particle dispersion models, such as FLEXPART, should play an important role in both forecasting the dispersion of contamination and identifying “hot spot” locations exposed to high levels of contamination.…


Air quality (AQ) data collected by the California Air Resources Board (ARB). The ARB measures air pollutants at more than 180 locations across California.

partMOM: California AQ App

Source: California Environmental Protection Agency

Download: We’re working to integrate our datasets into Google Fusion Tables and/or Pachube.  If you’d like to help please contact us.

Wiki: datafile: california_air_quality

Commentary:  Notable air pollutants measured by the ARB include:

Black Carbon, Carbon Dioxide, Carbon Monoxide, COH (coefficient of haze), Hydrogen Sulfide, Light Scattering, Methane, Nitrogen Dioxide, Nitric Oxide, NOx, NOy, Non-Methane Hydrocarbons, Ozone, Total Hydrocarbons.

Of particular interest:

The ARB also measures particulate matter with aerodynamic diameters less than or equal to 10 microns and 2.5 microns (PM10 and PM2.5 respectively). Off the top of my head, I don’t recall the diameters of the particulate emanating from FNPP – but this is something to look into.

Sulfur dioxide and PM10 sulfate are also measured.  According to the paper by Priyadarshi A., et al. [13], both of these compounds provide evidence of neutron leakage from FNPP. I haven’t scoured the data, but it’s there for anyone who’s interested.

We are looking for more air quality measurements from around the country (and the world, for that matter)!…


As part of the Comprehensive Nuclear-Test-Ban Treaty, an international monitoring network was set up to measure radionuclide concentrations in the atmosphere.  The data is aggregated by the CTBT Preparatory Commission and disseminated to the participating countries.  The CTBTO itself will not publish the data – only a participating country can choose to publish the measurements. Currently, very few countries have the guts to publish this data – Germany is the source of this material.

partMOM: CTBTO app

Source: The Federal Office for Radiation Protection in Germany (BFS)

Download: We’re working to integrate our datasets into Google Fusion Tables and/or Pachube.  If you’d like to help please contact us.

Wiki: datafile: ctbto_air_sampling

Commentary:  A useful data set containing I-131 and Cs-137 readings from ~15 locations worldwide. I usually use this data set as a preliminary validation of any dispersion plots.  The fact that the data did not have to be disclosed, but was released by choice, leads me to believe it may be one of the more transparent data sets currently available.…