In a network of wireless sensor nodes with limited battery power, one of the most fundamental applications is data aggregation, which collects nearby environmental conditions and aggregates the data to a designated destination called a sink node. Because the nodes' energy is limited, the key concerns in data aggregation are time efficiency and energy consumption; the related problem, named Minimum Latency Aggregation Scheduling (MLAS), has therefore been the focus of many researchers. Its objective is to compute a minimum-latency schedule, that is, a schedule with the minimum number of timeslots, such that the sink node can receive the aggregated data from all the other nodes without any collision or interference. For this problem, two interference models have been adopted: the graph model and the more realistic physical interference model, known as the Signal-to-Interference-plus-Noise-Ratio (SINR) model. These are studied under different power models (uniform power and non-uniform power, with or without power control) and different antenna models (omni-directional and directional). Since the problem has been proven NP-hard, this survey article presents and compares several state-of-the-art approximation algorithms in the various models, using latency as the performance measure.

Publisher: World Academy of Science, Engineering and Technology. International Science Index 125, 2017.
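The physical interference (SINR) model mentioned above has a standard formulation; the sketch below uses the conventional symbols (transmission power $P$, path-loss exponent $\alpha$, ambient noise $N_0$, hardware threshold $\beta$) rather than notation taken from any particular surveyed paper:

```latex
% A transmission from u to v succeeds in a given timeslot iff
\[
\mathrm{SINR}(u,v)
  \;=\; \frac{P_u \, d(u,v)^{-\alpha}}
             {N_0 \;+\; \sum_{w \in S,\, w \neq u} P_w \, d(w,v)^{-\alpha}}
  \;\geq\; \beta ,
\]
% where S is the set of nodes transmitting concurrently with u,
% d(.,.) is Euclidean distance, alpha is the path-loss exponent
% (typically 2 < alpha < 6), N_0 is the ambient noise, and beta
% is the minimum SINR required for correct reception.
```

Under the uniform-power model every $P_u$ is identical, while the power-control variants allow each node to choose its own $P_u$, which is what distinguishes the power models compared in this survey.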
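The scheduling objective can be made concrete with a toy greedy heuristic under the graph interference model; this is an illustrative sketch only (the topology, names, and greedy rule are assumptions for the example, not one of the surveyed approximation algorithms):

```python
# Toy bottom-up aggregation scheduler under the graph (protocol)
# interference model: a transmission fails if any other concurrent
# transmitter is a neighbor of its receiver.
from collections import deque

def bfs_tree(adj, sink):
    """Build a BFS aggregation tree rooted at the sink."""
    parent, order, q = {sink: None}, [sink], deque([sink])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in parent:
                parent[v] = u
                order.append(v)
                q.append(v)
    return parent, order

def greedy_schedule(adj, sink):
    """Give each non-sink node the earliest collision-free timeslot
    after all of its children have transmitted."""
    parent, order = bfs_tree(adj, sink)
    children = {u: [] for u in adj}
    for v, p in parent.items():
        if p is not None:
            children[p].append(v)

    slot, busy = {}, {}            # node -> slot, slot -> transmitters

    def conflicts(u, t):
        # Receiver p(u) must not hear another transmitter, and u's
        # transmission must not reach another active receiver.
        for w in busy.get(t, ()):
            if parent[u] == w or parent[u] in adj[w]:
                return True
            if parent[w] == u or parent[w] in adj[u]:
                return True
        return False

    for u in reversed(order):      # schedule deepest nodes first
        if u == sink:
            continue
        t = 1 + max((slot[c] for c in children[u]), default=0)
        while conflicts(u, t):
            t += 1
        slot[u] = t
        busy.setdefault(t, set()).add(u)
    return slot

# Path a-b-c with sink a: c must send to b before b sends to a.
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
print(greedy_schedule(adj, "a"))   # → {'c': 1, 'b': 2}
```

The latency of the resulting schedule is the largest assigned slot; MLAS asks for the schedule minimizing that quantity, which the algorithms compared in the survey approximate since computing the optimum is NP-hard.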