Performance evaluation of baseline-dependent window functions with several weighting functions
- Authors: Vanqa, Kamvulethu
- Date: 2024-04-04
- Subjects: Radio interferometers , Data compression (Computer science) , Window function , Gradient descent , Computer algorithms , Time smearing
- Language: English
- Type: Academic theses , Master's theses , text
- Identifier: http://hdl.handle.net/10962/435850 , vital:73206
- Description: Radio interferometric data volume is increasing exponentially, with the potential to cause slow processing and data storage issues for radio observations recorded at high time and frequency resolution. This necessitates that some form of data compression be imposed. The conventional method is averaging across time and frequency; however, this results in amplitude loss and source distortion towards the edges of the field of view. To reduce amplitude loss and source distortion, baseline-dependent window functions (BDWFs) have been proposed in the literature. BDWFs are visibility data compression methods that use window functions to retain signals within a field of interest (FoI) and to suppress signals outside this FoI. However, BDWFs have so far been used with window functions as described in the signal processing literature, without any optimisation. This thesis evaluates the performance of BDWFs and then proposes machine learning with gradient descent to optimise the window functions employed in BDWFs (a minimal sketch of this idea follows the record below). Results show that convergence of the objective function is limited by the band-limited nature of the window functions in Fourier space. BDWF performance is also investigated and discussed under several weighting schemes. Results show that there exists an optimal parameter tuning (not necessarily unique) suggesting an optimal combination of BDWFs and density sampling: ∼4% smearing is observed within the FoI and ∼80% source suppression is achieved outside the FoI for the MeerKAT telescope at 1.4 GHz, sampled at 1 s and 184.3 kHz and then averaged with BDWFs to reach a compression factor of 4 in time and 3 in frequency. , Thesis (MA) -- Faculty of Science, Mathematics, 2024
- Date Issued: 2024-04-04
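As a companion to the abstract above, here is a minimal, self-contained sketch of the gradient-descent idea: tune one window-shape parameter so that the window's Fourier-domain response stays flat inside the FoI and is suppressed outside it. The Gaussian parametrisation, the smearing-plus-leakage objective, and every name in the snippet are illustrative assumptions, not the thesis's actual method or code.

```python
# Minimal sketch, not the thesis code: gradient descent on a single
# window-shape parameter so that the window's Fourier-domain response
# stays flat inside a field of interest (FoI) and is suppressed outside.
# The Gaussian window, the objective, and all names are assumptions.
import numpy as np

n_taps = 64                            # high-resolution samples per averaging bin
t = np.linspace(-0.5, 0.5, n_taps)     # normalised tap positions within the bin
l = np.linspace(0.0, 4.0, 512)         # normalised sky offset (FoI edge at l = 1)
foi = l <= 1.0                         # field-of-interest mask

def response(alpha):
    """Amplitude response vs. sky offset for a Gaussian-tapered window."""
    w = np.exp(-0.5 * (alpha * t) ** 2)
    w /= w.sum()                       # unit gain at the field centre
    return np.abs(np.exp(-2j * np.pi * np.outer(l, t)) @ w)

def objective(alpha):
    r = response(alpha)
    smear = np.mean((1.0 - r[foi]) ** 2)   # amplitude loss inside the FoI
    leak = np.mean(r[~foi] ** 2)           # residual response outside the FoI
    return smear + leak

alpha, lr, eps = 1.0, 2.0, 1e-4
for _ in range(300):
    # numerical gradient via central differences, then a plain descent step
    grad = (objective(alpha + eps) - objective(alpha - eps)) / (2 * eps)
    alpha -= lr * grad

print(f"alpha = {alpha:.3f}, objective = {objective(alpha):.5f}")
```

Even in this toy setting the gradient flattens quickly once the window's main lobe roughly matches the FoI, loosely mirroring the limited convergence that the abstract attributes to the band-limited nature of the windows.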
Data compression, field of interest shaping and fast algorithms for direction-dependent deconvolution in radio interferometry
- Authors: Atemkeng, Marcellin T
- Date: 2017
- Subjects: Radio astronomy , Solar radio emission , Radio interferometers , Signal processing -- Digital techniques , Algorithms , Data compression (Computer science)
- Language: English
- Type: Thesis , Doctoral , PhD
- Identifier: http://hdl.handle.net/10962/6324 , vital:21089
- Description: In radio interferometry, observed visibilities are intrinsically sampled at some interval in time and frequency. Modern interferometers are capable of producing data at very high time and frequency resolution; practical limits on storage and computation costs require that some form of data compression be imposed. The traditional form of compression is simple averaging of the visibilities over coarser time and frequency bins. This has an undesired side effect: the resulting averaged visibilities “decorrelate”, and do so differently depending on the baseline length and averaging interval. This translates into a non-trivial signature in the image domain known as “smearing”, which manifests itself as an attenuation in amplitude towards off-centre sources. With the increasing fields of view and/or longer baselines employed in modern and future instruments, the trade-off between data rate and smearing becomes increasingly unfavourable. Averaging also results in a baseline-length- and position-dependent point spread function (PSF). In this work, we investigate alternative approaches to low-loss data compression. We show that averaging of the visibility data can be understood as a form of convolution by a boxcar-like window function, and that by employing alternative baseline-dependent window functions a more optimal interferometer smearing response may be induced (a toy numerical illustration of this follows the record below). Specifically, we can improve amplitude response over a chosen field of interest and attenuate sources outside the field of interest. The main cost of this technique is a reduction in nominal sensitivity; we investigate the smearing vs. sensitivity trade-off and show that in certain regimes a favourable compromise can be achieved. We show the application of this technique to simulated data from the Jansky Very Large Array and the European Very Long Baseline Interferometry Network. Furthermore, we show that the position-dependent PSF shape induced by averaging can be approximated using linear algebraic properties to effectively reduce the computational complexity for evaluating the PSF at each sky position. We conclude by implementing a position-dependent PSF deconvolution in an imaging and deconvolution framework. Using the Low-Frequency Array radio interferometer, we show that deconvolution with position-dependent PSFs results in higher image fidelity compared to a simple CLEAN algorithm and its derivatives.
- Date Issued: 2017
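To make the “averaging is convolution by a boxcar-like window” point above concrete, the short sketch below compares the smearing response of plain averaging against a tapered window for a single off-centre point source. The fringe model and the Hann taper are assumptions chosen for illustration; the thesis works with baseline-dependent windows rather than a single fixed taper.

```python
# Minimal sketch, not the thesis code: plain averaging of visibilities
# over a time bin acts as convolution with a boxcar window, and swapping
# in a tapered window reshapes the smearing response. The single-source
# fringe model and the Hann alternative are illustrative assumptions.
import numpy as np

n = 256                                # high-resolution samples in one bin
tau = np.linspace(-0.5, 0.5, n)        # normalised time within the bin
offsets = np.linspace(0.0, 3.0, 13)    # source offset, FoI edge at 1.0

def smeared_amplitude(window):
    w = window / window.sum()          # unit gain at the field centre
    # a unit point source at offset l contributes a rotating phase;
    # the windowed average attenuates it by the window's response
    return [abs(np.sum(w * np.exp(2j * np.pi * l * tau))) for l in offsets]

boxcar = np.ones(n)                             # plain averaging
hann = 0.5 * (1.0 + np.cos(2.0 * np.pi * tau))  # a tapered alternative

for name, w in [("boxcar", boxcar), ("hann  ", hann)]:
    amps = smeared_amplitude(w)
    print(name, " ".join(f"{a:.2f}" for a in amps))
```

The boxcar row reproduces the familiar sinc-shaped smearing with its first null at the bin edge, while the tapered row trades a wider main lobe for lower far sidelobes; choosing that trade-off per baseline is the essence of the baseline-dependent windows described in the abstract.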