# String theory and string cosmology

There are several potential projects in CoPS suitable for Masters students to work on as their degree projects. They are classified into the following four categories:

» Cosmology

» Particle Astrophysics

» String Theory

» General Relativity

Cosmology: (Rahman Amanullah, Ariel Goobar, Edvard Mörtsell)

Cosmo-1: Spectroscopic SN survey [simulation]

SDSS-III should find a large number of spectroscopic SN detections. What can this data set be used for? The characteristics of a spectroscopic survey are very different from those of traditional searches. The different spectroscopic signatures of SNe and galaxies may allow detection at lower SN/galaxy brightness ratios. This could be particularly useful for finding SNe in early-type galaxies, where the spectroscopic signatures differ even more than for late-type hosts. While a traditional search is simply a matter of detecting a brightness increase on an already bright background, a spectroscopic search becomes the problem of separating a superimposed spectrum into its eigenvectors.

Cosmo-2: SN typing efficiency for lensed SN cluster survey [simulation]

MC simulation of SNe in an ongoing cluster survey in order to determine detection efficiencies and photometric typing efficiencies. These will be essential for computing the control time and, later, SN rates.

Cosmo-3: SDSS SN Host Galaxies (a) [data]

Measure the surface brightness of type Ia SN hosts. This can be used for cleaner host galaxy subtraction of our spectroscopic sample, but also for investigating whether there is a correlation between the surface brightness and the SN properties (spectroscopic and photometric). It may also be worth fitting PÉGASE models to the host photometry and doing an analysis à la Sullivan et al. (2006) and Howell et al. (2008).

Cosmo-4: SDSS SN Host Galaxies (b) [data]

Study the SN Ia radial distance from its host galaxy core vs SN colours for the SDSS sample. This could be a way to study host galaxy extinction separately from the intrinsic SN Ia colour law, by assuming that the probability of host galaxy extinction has a radial dependence. Compare to existing dust models (e.g. Commins). The sample could also be divided into active and passive galaxies.

Cosmo-5: SDSS Host Galaxies (c) [data]

None of the attempts to study SN Ia host galaxy properties spectroscopically have been satisfactory. The main goal here is to use host galaxy spectroscopy to break the age-metallicity degeneracy, which is generally difficult and requires high S/N data. None of the published work has attempted to estimate the systematic uncertainties involved in these different approaches.

Cosmo-6: Cosmology with Type II supernovae [simulations]

In the future, in addition to the Type Ia supernovae normally used for doing cosmology, we expect to find large numbers of Type II supernovae. The question is if and how these can be used to constrain cosmology. The student should check what kind of cosmological constraints we can obtain with future data using the distance-redshift relation derived from Type II supernovae, taking into account the intrinsic dispersion, the possibility of calibrating the luminosities, the redshift distribution, the magnitude limits of the surveys, etc.

Cosmo-7: Cosmology with tons of low quality data [simulations]

With future surveys, we expect to find a very large number of Type Ia supernovae, but we will not have the resources to obtain spectra for all of them, and thus will not get redshifts (unless we know the host) or secure typing. How can we do cosmology with such data sets? How often do we expect to know the host redshift? How large is the problem of only having photometric redshifts and/or not knowing the type?

Cosmo-8: Calibrating the BAO peak (and Type Ia supernova) measurements [data, simulations]

Is it possible to improve cosmological results by introducing subclasses of galaxies used for measuring the Baryon Acoustic Oscillation (BAO) scale and/or of the Type Ia supernovae used to probe the redshift-distance relation? This could be the case if what one loses in statistics is gained by sharpening the standard candles (for supernovae) and by better control of the biases (for BAO).

Cosmo-9: Supernova rates (a) [data, statistics]

In Kuznetsova et al. (arXiv:0710.3120), the rate of Type Ia supernovae up to a redshift of 1.7 is measured using Hubble Space Telescope data. They employ a so-called Bayesian approach to calculate the probability that an object is a Type Ia supernova. However, this probability depends on the prior fractional probability of Type Ia supernovae that is used, and therefore it seems that the obtained rates will be proportional to this prior. The student will verify that this is indeed the case and show how the results change when the prior is changed. For extra credit, the student can check if, and how, it is possible to avoid this prior.
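
The suspected proportionality can be illustrated with a toy two-class posterior (a schematic calculation, not the actual Kuznetsova et al. pipeline; the likelihood values below are made up for illustration):

```python
def posterior_ia(f_prior, like_ia, like_cc):
    """Posterior probability that an object is a Type Ia supernova,
    given a prior Ia fraction f_prior and the data likelihoods under
    the Ia and core-collapse hypotheses (Bayes' theorem, two classes)."""
    return f_prior * like_ia / (f_prior * like_ia + (1.0 - f_prior) * like_cc)

# For a marginal candidate (comparable likelihoods), the posterior -- and
# hence the inferred rate, which sums such probabilities over candidates --
# tracks the prior almost linearly:
for f in (0.1, 0.3, 0.5):
    print(f, posterior_ia(f, like_ia=1.0, like_cc=1.0))
```

Only for candidates whose likelihood ratio strongly favours one class does the prior dependence wash out, which is the crux of the project.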

Cosmo-10: Supernova rates (b) [data, simulations]

In Dahlen et al. (arXiv:0803.1130), Hubble Space Telescope data are used to derive the Type Ia supernova rate in four redshift intervals in the range 0.2&lt;z&lt;1.8. They find a drop in the rate at high redshift, suggesting a long time delay between the formation of the progenitor star and the explosion of the supernova. The derived rates depend on what is assumed for the dust extinction of Type Ia supernovae. Dahlen et al. use three different models, at least one of which is parameterised in terms of the so-called reddening parameter, R(V)=3.1. The student should check how the result changes if we allow for lower values of R(V), as suggested by other data.

Cosmo-11: Limits on dust from x-ray halos and supernova Ia cosmology [simulations]

In Petric et al. (arXiv:astro-ph/0609589), the non-detection of an x-ray dust scattering halo around the quasar QSO 1508+5714 was used to put an upper limit on the density of diffuse, large-grained intergalactic dust. This should allow us to put constraints on the extinction of SNe Ia in the Hubble diagram. The student should check how powerful these constraints are, whether they are sensitive to assumptions made about the properties and distribution of the dust, and if and how the limits can be improved.

Cosmo-12: Solving the TeV gamma ray crisis [simulations]

Since high-energy photons are expected to scatter off the infrared background, we should not be able to observe distant sources at TeV energies. However, such sources are in fact observed (the “TeV gamma ray crisis”). Could this be due to photons oscillating back and forth into axions, thus evading the IR background?
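
The proposed mechanism is ordinary two-state mixing. A minimal sketch, with the convention that l_osc is the full oscillation period (the effective mixing angle and oscillation length depend on the magnetic field, plasma density and axion parameters, none of which are fixed here):

```python
import math

def p_gamma_to_axion(theta, length, l_osc):
    """Probability that a photon converts to an axion after traversing a
    magnetic domain of size `length`, for effective mixing angle `theta`
    and oscillation length `l_osc` (same units as `length`):
    P = sin^2(2*theta) * sin^2(pi * length / l_osc)."""
    return math.sin(2.0 * theta) ** 2 * math.sin(math.pi * length / l_osc) ** 2

# Maximal mixing over half an oscillation length gives full conversion:
print(p_gamma_to_axion(math.pi / 4.0, 0.5, 1.0))  # -> 1.0
```

In a realistic treatment the line of sight crosses many randomly oriented domains, so one averages the single-domain probability over domain orientations and sizes.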

Cosmo-13: Constraining photon-axion oscillations using GRB afterglows [data, simulations]

Photon-axion oscillations may cause high-redshift objects to appear dimmer. A possible signature of this effect is a characteristic wavelength dependence. Therefore, oscillations can be constrained using the observed colours of quasars. Another possible route would be to use observations of the nearly featureless spectra of GRB afterglows to constrain the effect.

Cosmo-14: Probing curvature with cosmological distance measurements [simulations]

It is well known that the curvature of the universe and the properties of dark energy show strong degeneracies. It is therefore important for dark energy studies to constrain the curvature. Currently, the best constraints come from the microwave background, but these results are degenerate with the value of the Hubble constant. The only direct way to probe curvature is to compare coordinate distances with angular or luminosity distances. Angular and luminosity distances come from supernovae, baryon acoustic oscillations etc. Coordinate distances are harder to measure but could potentially be constrained from ages of galaxies etc. The student should check how good the data need to be to obtain useful constraints on the curvature and what limits we can obtain (if any) with current data.
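
The comparison can be sketched numerically. A minimal toy model (illustrative density parameters and H0, simple trapezoid integration, not a production cosmology code): the line-of-sight coordinate distance follows from integrating 1/H(z), and curvature enters the luminosity distance through a sin/sinh of that integral.

```python
import math

C_KM_S = 299792.458  # speed of light, km/s

def comoving_distance(z, omega_m=0.3, omega_de=0.7, omega_k=0.0, h0=70.0, n=10000):
    """Line-of-sight comoving distance in Mpc by trapezoidal integration
    of c/H(z). Density parameters and H0 (km/s/Mpc) are illustrative."""
    def e(zz):
        return math.sqrt(omega_m * (1 + zz) ** 3 + omega_k * (1 + zz) ** 2 + omega_de)
    dz = z / n
    s = 0.5 * (1.0 / e(0.0) + 1.0 / e(z))
    for i in range(1, n):
        s += 1.0 / e(i * dz)
    return (C_KM_S / h0) * s * dz

def luminosity_distance(z, omega_m=0.3, omega_de=0.7, omega_k=0.0, h0=70.0):
    """Luminosity distance in Mpc; curvature enters via sinh (open) or sin
    (closed) of the comoving distance."""
    dc = comoving_distance(z, omega_m, omega_de, omega_k, h0)
    if abs(omega_k) < 1e-12:
        dm = dc
    elif omega_k > 0.0:
        k = math.sqrt(omega_k) * h0 / C_KM_S
        dm = math.sinh(k * dc) / k
    else:
        k = math.sqrt(-omega_k) * h0 / C_KM_S
        dm = math.sin(k * dc) / k
    return (1.0 + z) * dm
```

A measured mismatch between the coordinate distance and the luminosity distance at the same redshift would directly signal non-zero curvature, independently of the dark energy model.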

Cosmo-15: An asymptotic model of w(z) and q(z) [model fitting]

In Hannestad and Mörtsell (arXiv:astro-ph/0407259), constraints on the evolution of the equation of state of dark energy, w(z), are obtained using a parametrisation of w which has the advantage of being transparent, simple to extend to more parameters as better data become available, and well behaved in all asymptotic limits. The student should update this analysis using current data. A similar approach can be used for the deceleration parameter q(z).

Cosmo-16: Holographic Dark Energy Model [model fitting]

Check for observational constraints on the Interacting Holographic Dark Energy Model of Setare and Vagenas (arXiv:0704.2070).

Cosmo-17: Interacting dark matter-dark energy models [model fitting]

Constrain the strength of the interaction between dark matter and dark energy using cosmological data. Useful reference: Amendola et al. (arXiv:astro-ph/0610806).

Cosmo-18: The SCP transient and microlensing [data, simulations]

An unusual optical transient was discovered during the Hubble Space Telescope Cluster Supernova Survey (arXiv:0809.1648). The transient brightened over a period of ~100 days and then declined over a similar timescale. Since the transient's spectrum is inconsistent with all known supernova types and does not match any known spectrum, it is suggested that it may belong to a new class. It is also claimed that the shape of the lightcurve is inconsistent with microlensing. However, the latter is not quantified in the paper. The student should check why this is the case and whether the lensing interpretation may be valid if one includes the extension of the source etc. Ideally, one would obtain some general results that, e.g., couple the concavity/convexity of a lightcurve to the possibility of lensing.

Cosmo-19: Bose–Einstein condensate dark matter and gravitational lensing [model fitting]

In Böhmer and Harko (arXiv:0705.4158), the possibility that the dark matter could be in the form of a Bose–Einstein condensate is considered. The authors compute the deflection of photons passing through such dark matter halos. The bending angle obtained for the Bose–Einstein condensate is larger than that predicted by standard general relativistic and dark matter models. Therefore the study of light deflection by galaxies and of gravitational lensing could discriminate between the Bose–Einstein condensate dark matter model and other dark matter models. Since the authors do not do this, it would be a good project for a student.

Cosmo-20: Inhomogeneous cosmology [theory, simulations]

There has been a lot of interest lately in models where the apparent acceleration of the universe is due to the fact that the universe is inhomogeneous and that dense regions expand slower than the average; thus, as time goes on, larger volumes of the universe are occupied by regions with smaller densities. Syksy Räsänen has worked extensively on this. The volume averaged expansion rate may thus show an apparent acceleration. The student should check how large density contrasts are needed to get a large enough apparent acceleration and compare this to the density contrasts observed. For extra credit, the student could check how the apparent acceleration affects cosmological observations such as supernova Ia distances using, e.g., ray tracing simulations.

Cosmo-21: A cosmological constant consistency check [data, model fitting]

Weak lensing and galaxy cluster counts constrain the matter density and the amplitude of matter fluctuations, as parameterised by the sigma(8) parameter. sigma(8) can be calculated theoretically for a given cosmology. It has been suggested that comparing the value of sigma(8) obtained from lensing and clusters with the value obtained in the cosmology favoured by the microwave background, baryon acoustic oscillations and supernova data provides a nice consistency check of the standard cosmological constant model. This has not been done quantitatively, however, and it remains to be shown whether such a comparison could actually provide useful limits on, e.g., dark energy properties such as w(z).

Cosmo-22: Type Ia supernovae and compact objects in the universe [data, model fitting]

The possibility of putting limits on the fraction of compact objects in the universe using the observed magnification distribution of Type Ia supernovae has been investigated in, e.g., Mörtsell et al. (arXiv:astro-ph/0103489) and Metcalf and Silk (arXiv:astro-ph/0612253). In the latter, it is claimed that the data favour dark matter made of microscopic particles (in contrast to macroscopic compact objects) at 89% confidence. This analysis can be refined by including lensing from galaxy-size halos, newer data, etc., which could potentially tighten the constraints. However, a realistic treatment of possible systematic error sources that may also skew the luminosity distribution might make the results less constraining.

Particle Astrophysics (exp. & theo.): (Lars Bergström, Jan Conrad, Joakim Edsjö)

AstroPart-1: Detecting gamma-rays from space with Multivariate methods [simulation, data, exp]

The Fermi Gamma-ray Space Telescope (formerly called GLAST) is a satellite mission for measuring high-energy gamma-rays from space, launched in summer 2008. The project is an international collaboration, involving institutions from France, Italy, Japan, Sweden and USA. The effective energy range is 10 keV - 300 GeV. Charged particles form an overwhelming background to the gamma-rays and need to be efficiently rejected. Modern multivariate techniques (which are also used in other fields, like character recognition or finance) can be used to achieve this goal.

AstroPart-2: Search for Dark Matter sources in Fermi data [data, exp/theo]

The Fermi Gamma-ray Space Telescope (formerly called GLAST) is a satellite mission for measuring high-energy gamma-rays from space, launched in summer 2008. The project is an international collaboration, involving institutions from France, Italy, Japan, Sweden and USA. Fermi has discovered many sources which can be associated with sources known from observations at other wavelengths (for example Active Galactic Nuclei). A large fraction of the sources, however, are unidentified, meaning no associated source has been discovered; these are potentially objects which consist mainly of Dark Matter. The student is supposed to find out!

AstroPart-3: Search for cosmological Dark matter signal in Fermi data [data, exp/theo]

The Fermi Gamma-ray Space Telescope (formerly called GLAST) is a satellite mission for measuring high-energy gamma-rays from space, launched in summer 2008. The project is an international collaboration, involving institutions from France, Italy, Japan, Sweden and USA. Dark Matter annihilating throughout the Universe might give rise to a small contribution to an almost isotropic diffuse gamma-ray flux measurable by Fermi. The student is supposed to analyse the Fermi data to extract the cosmological component and try to interpret it in terms of Dark Matter.

AstroPart-4: Optimisation of ground based Cherenkov telescope for Dark Matter Search [simulation, exp]

HESS is a system of imaging atmospheric Cherenkov telescopes for the investigation of cosmic gamma-rays in the energy range between 100 GeV and several TeV. The HESS collaboration consists of about 30 institutes from Germany, France, Great Britain, Poland, Ireland, the Czech Republic, Australia and South Africa. HESS has been operating for about 5 years and has yielded many exciting discoveries. We are now preparing for the next generation experiment, which will be much improved. We would like to use it to discover signals of Dark Matter. The design of the instrument needs to be optimised for this purpose.

AstroPart-5: Doing cosmology with ground-based Cherenkov telescopes [simulation, data, exp]

HESS is a system of imaging atmospheric Cherenkov telescopes for the investigation of cosmic gamma-rays in the energy range between 100 GeV and several TeV. The name HESS stands for High Energy Stereoscopic System. The HESS collaboration consists of about 30 institutes from Germany, France, Great Britain, Poland, Ireland, the Czech Republic, Australia and South Africa. HESS has been operating for about 5 years and has yielded many exciting discoveries. We are now preparing for the next generation experiment, which will be much improved. We would like to use it to discover signals of Dark Matter. The design of the instrument needs to be optimised for this purpose.

AstroPart-6: Using Artificial Neural Nets for measuring energy in Fermi [simulation, data, exp]

The Fermi Gamma-ray Space Telescope (formerly called GLAST) is a satellite mission for measuring high-energy gamma-rays from space, launched in summer 2008. The project is an international collaboration, involving institutions from France, Italy, Japan, Sweden and USA. The effective energy range is 10 keV - 300 GeV. Annihilating Dark Matter particles might reveal themselves through features in the energy spectrum of gamma-rays. In particular, a sharp line at an energy corresponding to the mass of the particle might be detected. It is therefore necessary to measure the energy as accurately as possible, using as much of the detector information as possible. Advanced techniques (for example Artificial Neural Networks) might help accomplish this goal.

AstroPart-7: Search for Dark Matter sources in HESS data [data, exp]

HESS is a system of imaging atmospheric Cherenkov telescopes for the investigation of cosmic gamma-rays in the energy range between 100 GeV and several TeV. The HESS collaboration consists of about 30 institutes from Germany, France, Great Britain, Poland, Ireland, the Czech Republic, Australia and South Africa. HESS has discovered many sources which can be associated with sources known from observations at other wavelengths (for example Active Galactic Nuclei). A large fraction of the sources, however, are unidentified, meaning no associated source has been discovered; these are potentially objects which consist mainly of Dark Matter. The student is supposed to find out!

AstroPart-8: Applied Statistics in Astroparticle Physics [simulations, exp]

Experimental and theoretical astroparticle physics requires modern statistical techniques for statistical inference in multi-dimensional parameter spaces and for estimates in the limit of low numbers of counts, where most approximate methods break down. Several studies of statistical methods can be performed, which could be of interest not only to the field of astroparticle physics.

AstroPart-9: Spectral Unfolding of gamma-ray spectra [simulations, data, exp]

Spectral unfolding is a method to infer the true energy spectrum of a source from the measured one, i.e. a way to correct for the distortions introduced by the detector, described by the "instrument response". In astrophysics, the problem is often treated by folding a source model with the instrument response; spectral unfolding, in contrast, does not require assumptions about the source. The student is expected to apply spectral unfolding to Fermi or HESS data and simulations.
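
As a toy illustration (with a made-up Gaussian-smearing response, not a real Fermi or HESS response matrix), folding is a matrix product and unfolding is a, possibly regularised, inversion:

```python
import numpy as np

# Toy spectrum and an assumed Gaussian-smearing response (illustration only).
n = 20
centers = np.linspace(0.025, 0.975, n)          # bin centres on [0, 1]
true_spec = np.exp(-3.0 * centers)              # smooth "true" spectrum

sigma = 0.08                                    # energy resolution (assumed)
resp = np.exp(-0.5 * ((centers[:, None] - centers[None, :]) / sigma) ** 2)
resp /= resp.sum(axis=0, keepdims=True)         # columns sum to 1 (no losses)

measured = resp @ true_spec                     # folding: model -> detector

# Without noise, direct inversion recovers the truth exactly:
unfolded_exact = np.linalg.solve(resp, measured)

# With noise, direct inversion amplifies fluctuations; a Tikhonov (ridge)
# term trades a small bias for stability:
rng = np.random.default_rng(0)
noisy = measured + rng.normal(0.0, 0.005, size=n)
lam = 1e-3
unfolded_reg = np.linalg.solve(resp.T @ resp + lam * np.eye(n), resp.T @ noisy)
```

Real unfolding methods (e.g. iterative or Bayesian schemes) refine this basic idea; the toy example only shows why some form of regularisation is needed as soon as the data are noisy.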

AstroPart-10: Search for strange quark matter with Fermi [simulations, data, exp]

Strange Quark Matter is a proposed state of hadronic matter consisting of up, down and strange quarks. If this state is stable, as proposed by various phenomenological models, there will most likely be an experimentally accessible component of strange quark matter particles (strangelets) in the cosmic ray flux. Members of our group have suggested ways to detect these particles using the Fermi satellite. The student is expected to take a closer look at this opportunity.

AstroPart-11: Combined searches for Dark Matter [simulations, data, exp/theo]

Dark Matter can be detected in different ways: indirectly (for example by the experiments Fermi and HESS in gamma-rays, ICECUBE in neutrinos and PAMELA in charged cosmic rays), directly through scattering of Dark Matter particles in special particle detectors, and by producing possible Dark Matter particles at colliders. Most probably, it will not be sufficient to detect a Dark Matter particle in only one of these ways. The student is supposed to develop and apply methods to combine several experiments for Dark Matter searches.

AstroPart-12: The Sommerfeld resonance in atomic physics and astroparticle physics

There has recently been excitement about a possible dark matter signal in cosmic-ray positron data. Since dark matter candidates are believed to annihilate in the galactic halo into equal amounts of particles and antiparticles, this could be the source of the observed excess. However, if one takes standard cross sections for the annihilation, conventional models seem to need a “boost” of several orders of magnitude. One possibility for such an increase of the cross sections is given by the so-called Sommerfeld factor, which was introduced by Sommerfeld in atomic physics at the beginning of the last century. Recently, it has been shown that this enhancement could also come into play for dark matter. The project will give a historical overview of the field and, more specifically, derive, for the dark matter case, the expression for the enhancement factor for various masses and other parameters of the theory.
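
For an attractive Coulomb-like potential the enhancement has the closed form S = x/(1 - e^(-x)) with x = pi*alpha/v; a sketch (the coupling and velocity values below are illustrative, and the massive-mediator case, which saturates the enhancement and adds resonances, is ignored):

```python
import math

def sommerfeld_factor(alpha, v):
    """Sommerfeld enhancement for an attractive Coulomb-like potential:
    S = x / (1 - exp(-x)), with x = pi * alpha / v (v in units of c).
    This is the massless-mediator limit; a massive mediator saturates
    the enhancement and can introduce resonances, ignored here."""
    x = math.pi * alpha / v
    return x / (1.0 - math.exp(-x))

# In the halo (v ~ 1e-3 c) a weak-scale coupling gives an O(100) boost:
print(sommerfeld_factor(0.03, 1e-3))
```

Note that S goes to 1 for v much larger than pi*alpha, so the enhancement is absent at freeze-out but large in the halo today, which is exactly the behaviour invoked to boost the positron signal.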

AstroPart-13: Mechanisms to get higher relic densities of WIMPs with non-standard cosmologies in the Early Universe

A way to get an enhancement of the annihilation cross section for dark matter candidates would be if there is something in the early universe affecting the freeze-out density of dark matter. This work will contain a critical assessment of proposed methods, and (perhaps) finding new ones that could explain present data.

AstroPart-14: Dependencies of WIMP scattering cross sections on nucleon content

One of the most promising methods of searching for dark matter is direct detection, where a dark matter particle (e.g. a WIMP) scatters off a nucleus and the recoil of the nucleus is recorded. However, the theoretical calculation of the scattering cross sections depends crucially on the quark and spin content of the nucleon. New data on this are now available, and the idea of this project is to implement these new data and investigate how much they affect the scattering cross sections.

AstroPart-16: Form factor dependence of WIMP capture rates in the Sun/Earth

When dark matter particles (e.g. WIMPs) scatter in the Sun or the Earth, they can lose enough energy to become gravitationally trapped. They then sink to the core of the Sun/Earth where they can annihilate and produce neutrinos. These neutrinos are then searched for with e.g. IceCube as signals from dark matter. However, a large part of the capture of WIMPs in the Sun/Earth occurs via scattering on heavy elements, where the form factors of the nuclei are of importance. The idea of this project is to calculate the capture of WIMPs in the Sun/Earth with different form factors to see how big an effect the form factors have on the total capture and hence the discovery potential for e.g. IceCube.
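
The size of the effect can be previewed with a standard parameterisation such as the Helm form factor (chosen here for illustration; the actual capture-rate codes may use other forms, e.g. exponential form factors). The numerical constants are the usual textbook choices, treated here as assumptions:

```python
import math

def helm_form_factor(q_mev, a_mass):
    """Helm nuclear form factor F(q) (dimensionless), a common choice in
    direct-detection and capture-rate calculations. q in MeV/c, a_mass is
    the atomic mass number. Parameter values follow the standard Helm
    ansatz (skin thickness s = 0.9 fm, a = 0.52 fm)."""
    hbarc = 197.327  # MeV fm
    s = 0.9          # fm, nuclear skin thickness (assumed)
    c = 1.23 * a_mass ** (1.0 / 3.0) - 0.6   # fm, effective nuclear radius
    rn = math.sqrt(c * c + (7.0 / 3.0) * math.pi ** 2 * 0.52 ** 2 - 5.0 * s * s)
    x = q_mev * rn / hbarc
    if x < 1e-8:
        return 1.0  # no suppression at zero momentum transfer
    j1 = (math.sin(x) - x * math.cos(x)) / (x * x)  # spherical Bessel j1
    return 3.0 * j1 / x * math.exp(-0.5 * (q_mev * s / hbarc) ** 2)
```

Because capture requires large energy transfers to slow the WIMP below the local escape velocity, the high-q suppression seen here directly reduces the capture rate on heavy elements such as iron, which is the effect the project would quantify.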

AstroPart-17: Implications of a thick dark matter disk on WIMP capture rates

There are indications that a fraction of the dark matter (WIMPs) in the Milky Way forms a thick disk that follows the general trends of motion of ordinary matter. This could have dramatic effects on the capture rate of WIMPs in the Sun (and Earth) as these WIMPs would move much slower compared to us than regular halo WIMPs. The idea would be to estimate the velocity distribution of these and calculate the enhancement to capture from these.

AstroPart-18: Rare decays for SUSY models

For all theories beyond the standard model, like supersymmetric theories, we get strong constraints from experiments. Of particular interest are rare decays, which are suppressed in the standard model of particle physics and where effects of new physics are therefore more easily seen. There are many new calculations of some of these rare decays (b -&gt; s gamma, isospin asymmetries etc.) that put constraints on SUSY models. The idea would be to interface codes that calculate these with DarkSUSY and investigate how these constraints affect the parameter space of viable dark matter models.

AstroPart-19: Implications of massive neutrinos on SUSY models

We know that neutrinos have mass due to the observation of their oscillations, and hence supersymmetric models (like the MSSM) need to be modified to take this into account. The idea would be to go through how this can be done and investigate possible consequences for, e.g., the ability of these models to explain the dark matter problem of the Universe, or for observables at accelerators or from cosmic ray fluxes.

AstroPart-20: Dark matter annihilation in helium-burning stars

See here for more info.

AstroPart-21: Dark matter in the Sun and the solar oxygen crisis

See here for more info.

String Theory: (Fawad Hassan)

S-Theory-1: Generalization of notion of spacetime in the context of extra dimensions

In many theories beyond the standard model, there are extra spatial dimensions. This is true in string theory, but the idea was around long before string theory. From the original idea of one extra dimension as a circle, many researchers have gone on to consider that if there are indeed several extra dimensions, perhaps they do not conform to any particular preconceptions we have about geometry. In particular, most string models seem not to have "ordinary" manifolds for the extra dimensions, but rather so-called "non-geometric" spaces (e.g. arXiv:0709.0257). Their apparent abundance in the theory makes them attractive for string phenomenology, but not much is known about their quantum properties, so calculating their partition functions would be a good project for study.

S-Theory-2: Simple models for understanding quantum effects in the very early universe

Although many people have worked on quantum field theory in curved spacetime, there is still fundamental disagreement among the experts on very basic conceptual issues, such as the meaning of a "vacuum" in time-dependent metrics. In string theory, which incorporates quantum field theory in the low-energy limit, the problem takes on a different character; there are already models (Lorentzian orbifolds, see e.g. hep-th/0310099) that describe time-dependent spacetimes quantum-mechanically, though the currently known examples are too over-simplified to be directly relevant to actual cosmology. However, as toy models there are several things that could be clarified by detailed study, and a brave student could well make some progress here.

S-Theory-3: Consistent generalizations of Einstein gravity of relevance to the cosmological constant problem

Observations show that more than 70 percent of the energy content of the Universe does not have any known origin and could be understood as the "energy of the vacuum". Because the Universe is so large and mostly empty, this still corresponds to an extremely small energy density, of the order of 10^-29 grams/cm^3. On the other hand, generic theories of particle physics predict much larger values of the vacuum energy, around 10^40 times (or even more than) the observed value. One possible resolution of the conflict could be to modify the theory of gravity at very large distances (low energies) so that the effect of a large vacuum energy is suppressed. A consistent framework for this is to regard our 3+1 dimensional Universe as part of a higher dimensional spacetime. One can then study the resulting modified gravity theories to find out if they are consistent and could address the vacuum energy problem, by computing the Green's functions and the associated cosmological solutions.

S-Theory-4: Using torus complex geometry to compute string scattering amplitudes

String theory grew out of attempts at a theory of the strong nuclear interaction, where one can attempt to extract an effective Lagrangian theory from the quantum-mechanical scattering matrix (S-matrix). The same kinds of methods are used in perturbative string theory today, though they have come quite some distance since then. In the 1980s, it was realised that the proper framework for describing the string worldsheet is the complex geometry of the torus, a subject that has been studied for centuries in mathematics, and significant progress was made for the simple (unrealistic) models available then. In the newer kinds of string models, there are plenty of amplitudes that are simply not known (see e.g. hep-th/0508043), even to first approximation. A motivated student could help develop a systematic method for computing string amplitudes using geometric methods.

General Relativity: (Kjell Rosquist)

GenRel-1: The influence of the expansion of the universe on local dynamics

The cosmological expansion may have an influence on the dynamics of local systems. Previous investigations, going back to Einstein &amp; Straus in 1945, have led to conflicting results. In this project you will use a promising new approach to analyse this problem. The results could be relevant for the determination of the cosmological acceleration/dark energy.

GenRel-2: Gravitomagnetism in the solar system

In Newtonian physics, the gravitational field depends only on the position of the masses, not their velocity. The electromagnetic field, on the other hand, depends both on the position (electric field) and the velocity of the charges (magnetic field), as is well known. In Einstein's theory of gravity, general relativity, there is a closer analogy between gravity and electromagnetism. In that theory, the gravitational field has both gravitoelectric and gravitomagnetic components, depending on the position and the velocity of the masses respectively. In this project you will study the gravitomagnetic fields in the solar system with the specific aim to look for possible observational effects.
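
The magnitude of one such effect, the frame dragging of an orbit by the Earth's rotation, can be estimated from the Lense-Thirring nodal precession formula. A back-of-the-envelope sketch (the Earth's spin angular momentum and the LAGEOS-like orbit below are illustrative round numbers, not fitted values):

```python
import math

def lense_thirring_nodal_rate(a_m, ecc, j_kgm2s):
    """Lense-Thirring nodal precession rate (rad/s) for an orbit of
    semi-major axis a_m (metres) and eccentricity ecc around a body
    with spin angular momentum j_kgm2s:
    Omega = 2 G J / (c^2 a^3 (1 - e^2)^(3/2))."""
    g = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8     # speed of light, m/s
    return 2.0 * g * j_kgm2s / (c ** 2 * a_m ** 3 * (1.0 - ecc ** 2) ** 1.5)

# Assumed inputs: Earth's spin angular momentum ~5.86e33 kg m^2/s and a
# LAGEOS-like orbit with a ~ 12270 km, e ~ 0.004.
rate = lense_thirring_nodal_rate(1.227e7, 0.004, 5.86e33)
mas_per_year = rate * 3.156e7 * (180.0 / math.pi) * 3.6e6
# Of order tens of milliarcseconds per year -- tiny, but within reach of
# satellite laser ranging.
```

The smallness of this drift, compared to classical perturbations of the node, is what makes the observational side of the project challenging.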

GenRel-3: Gravitational and electromagnetic multipole moments in general relativity

The multipole moments represent the asymptotic structure of a field through its form at infinity, that is, far away from the system at hand. The aim of the project is to analyze by computer the multipole power series to determine the conditions for existence of a solution of the Einstein-Maxwell equations having exactly the four lowest possible moments (mass, angular momentum, electric charge and magnetic dipole) but no higher moments. One motivation for this project is to determine whether there exists a solution having exactly the four known moments of the electron but no other nonzero moments.

GenRel-4: Using the computer algebra system xTensor to construct approximate vacuum solutions of the Einstein equations

The gravitational field is governed by the Einstein equations, a complicated system of partial differential equations. It is therefore often difficult to construct realistic solutions for the gravitational field. The purpose of this project is to use computer algebra to construct approximate vacuum solutions of the Einstein equations. Such solutions are especially interesting since the major part of the universe contains practically no matter. The specific idea in this project is to use trial solutions depending on one or more arbitrary parameters and then minimize the matter/vacuum quotient. A convenient tool for this project is the new computer algebra system xTensor.

**Updated:**
2011-09-19

**Author:** Joakim Edsjö