{"id":606534,"date":"2025-07-09T14:06:11","date_gmt":"2025-07-09T19:06:11","guid":{"rendered":"https:\/\/towardsdatascience.com\/?p=606534"},"modified":"2025-07-09T14:06:26","modified_gmt":"2025-07-09T19:06:26","slug":"time-series-forecasting-made-simple-part-3-1-stl-decomposition-understanding-initial-trend-and-seasonality-prior-to-loess-smoothing","status":"publish","type":"post","link":"https:\/\/towardsdatascience.com\/time-series-forecasting-made-simple-part-3-1-stl-decomposition-understanding-initial-trend-and-seasonality-prior-to-loess-smoothing\/","title":{"rendered":"Time Series Forecasting Made Simple (Part 3.1): STL Decomposition"},"content":{"rendered":"\n<p class=\"wp-block-paragraph\"><mdspan datatext=\"el1752023920619\" class=\"mdspan-comment\">In the <a href=\"https:\/\/towardsdatascience.com\/author\/nikhil-dasari\/\">first two parts of this series<\/a><\/mdspan>, we explored trend, seasonality, and residuals using temperature data as our example. We started by uncovering patterns in the data with Python\u2019s <code>seasonal_decompose<\/code> method. Next, we made our first temperature forecasts using standard baseline models like the seasonal naive.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">From there, we went deeper and learned how <code>seasonal_decompose<\/code> actually computes the trend, seasonality and residual components. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">We extracted those pieces to build a decomposition-based baseline model and then experimented with custom baselines tailored to our data. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Finally, we evaluated each model using Mean Absolute Percentage Error (MAPE) to see how well our approaches performed.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">In the first two parts, we worked with temperature data, a relatively simple dataset where the trend and seasonality were clear and <code>seasonal_decompose<\/code> did a good job of capturing those patterns. 
<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">However, in many real-world datasets, things aren\u2019t always so straightforward. Trends and seasonal patterns can shift or get messy, and in these cases, <code>seasonal_decompose<\/code> may not capture the underlying structure as effectively.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">This is where we turn to a more advanced decomposition method to better understand the data: <strong>STL \u2014 Seasonal-Trend decomposition using LOESS.<\/strong><\/p>\n\n\n\n<p class=\"wp-block-paragraph\"><strong>LOESS<\/strong> stands for <strong>Locally Estimated Scatterplot Smoothing<\/strong>.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">To better understand this in action, we\u2019ll use the <em><a href=\"https:\/\/fred.stlouisfed.org\/series\/RSDSELD\">Retail Sales of Department Stores<\/a><\/em> dataset from FRED (Federal Reserve Economic Data).<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Here\u2019s what the data looks like:<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full is-resized\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/06\/time-series-blog-part-3-stl.png\" alt=\"\" class=\"wp-image-606829\" style=\"width:476px;height:auto\"\/><figcaption class=\"wp-element-caption\"><strong>Sample of the Retail Sales of Department Stores dataset from FRED.<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">The dataset we\u2019re working with tracks monthly retail sales from U.S. 
department stores, and it comes from the trusted FRED (Federal Reserve Economic Data) source.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">It has just two columns:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li class=\"wp-block-list-item\"><strong><code>Observation_Date<\/code><\/strong> \u2013 the beginning of each month<\/li>\n\n\n\n<li class=\"wp-block-list-item\"><strong><code>Retail_Sales<\/code><\/strong> \u2013 total sales for that month, in <strong>millions of dollars<\/strong><\/li>\n<\/ul>\n\n\n\n<p class=\"wp-block-paragraph\">The time series runs from <strong>January 1992 all the way to March 2025<\/strong>, giving us over 30 years of sales data to explore.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\"><strong>Note:<\/strong> Even though each date marks the start of the month (like <code>01-01-1992<\/code>), the sales value represents the total sales for the entire month.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">But before jumping into STL, we will run the classic <code>seasonal_decompose<\/code> on our dataset and take a look at what it shows us.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\"><strong>Code:<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-prismatic-blocks\"><code class=\"language-python\">import pandas as pd\nimport matplotlib.pyplot as plt\nfrom statsmodels.tsa.seasonal import seasonal_decompose\n\n# Load the dataset\ndf = pd.read_csv(&quot;C:\/RSDSELDN.csv&quot;, parse_dates=[&#039;Observation_Date&#039;], dayfirst=True)\n\n# Set the date column as index\ndf.set_index(&#039;Observation_Date&#039;, inplace=True)\n\n# Set monthly frequency\ndf = df.asfreq(&#039;MS&#039;)  # MS = Month Start\n\n# Extract the series\nseries = df[&#039;Retail_Sales&#039;]\n\n# Apply classical seasonal decomposition\nresult = seasonal_decompose(series, model=&#039;additive&#039;, period=12)\n\n# Plot with custom colors\nfig, axs = plt.subplots(4, 1, figsize=(12, 8), sharex=True)\n\naxs[0].plot(result.observed, 
color=&#039;olive&#039;)\naxs[0].set_title(&#039;Observed&#039;)\n\naxs[1].plot(result.trend, color=&#039;darkslateblue&#039;)\naxs[1].set_title(&#039;Trend&#039;)\n\naxs[2].plot(result.seasonal, color=&#039;darkcyan&#039;)\naxs[2].set_title(&#039;Seasonal&#039;)\n\naxs[3].plot(result.resid, color=&#039;peru&#039;)\naxs[3].set_title(&#039;Residual&#039;)\n\nplt.suptitle(&#039;Classical Seasonal Decomposition (Additive)&#039;, fontsize=16)\nplt.tight_layout()\nplt.show()\n<\/code><\/pre>\n\n\n\n<p class=\"wp-block-paragraph\"><strong>Plot:<\/strong><\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/06\/class-decomp-plot-part-3-tim-ser-1024x605.png\" alt=\"\" class=\"wp-image-606836\"\/><figcaption class=\"wp-element-caption\"><strong>Classical Seasonal Decomposition (Additive) of monthly retail sales.<\/strong><br>The observed series shows a gradual decline in overall sales. However, the seasonal component remains fixed across time \u2014 a limitation of classical decomposition, which assumes that seasonal patterns do not change, even when real-world behavior evolves.<\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">In Part 2, we explored how <code>seasonal_decompose<\/code> computes trend and seasonal components under the assumption of a fixed, repeating seasonal structure.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">However, real-world data doesn\u2019t always follow a fixed pattern. Trends may change gradually and seasonal behaviors can vary year to year. 
This is why we need a more adaptable approach, and STL decomposition offers exactly that.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">We will apply STL decomposition to the data to observe how it handles shifting trends and seasonality.<\/p>\n\n\n\n<pre class=\"wp-block-prismatic-blocks\"><code class=\"language-python\">import pandas as pd\nimport matplotlib.pyplot as plt\nfrom statsmodels.tsa.seasonal import STL\n\n# Load the dataset\ndf = pd.read_csv(&quot;C:\/RSDSELDN.csv&quot;, parse_dates=[&#039;Observation_Date&#039;], dayfirst=True)\ndf.set_index(&#039;Observation_Date&#039;, inplace=True)\ndf = df.asfreq(&#039;MS&#039;)  # Ensure monthly frequency\n\n# Extract the time series\nseries = df[&#039;Retail_Sales&#039;]\n\n# Apply STL decomposition\nstl = STL(series, seasonal=13)\nresult = stl.fit()\n\n# Plot and save STL components\nfig, axs = plt.subplots(4, 1, figsize=(10, 8), sharex=True)\n\naxs[0].plot(result.observed, color=&#039;sienna&#039;)\naxs[0].set_title(&#039;Observed&#039;)\n\naxs[1].plot(result.trend, color=&#039;goldenrod&#039;)\naxs[1].set_title(&#039;Trend&#039;)\n\naxs[2].plot(result.seasonal, color=&#039;darkslategrey&#039;)\naxs[2].set_title(&#039;Seasonal&#039;)\n\naxs[3].plot(result.resid, color=&#039;rebeccapurple&#039;)\naxs[3].set_title(&#039;Residual&#039;)\n\nplt.suptitle(&#039;STL Decomposition of Retail Sales&#039;, fontsize=16)\nplt.tight_layout()\n\nplt.show()\n<\/code><\/pre>\n\n\n\n<p class=\"wp-block-paragraph\"><strong>Plot:<\/strong><\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-large\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/06\/stl-decomp-plot-part-3-blog-1024x534.png\" alt=\"\" class=\"wp-image-606910\"\/><figcaption class=\"wp-element-caption\"><strong>STL Decomposition of Retail Sales Data.<\/strong><br>Unlike classical decomposition, STL allows the seasonal component to change gradually over time. 
This flexibility makes STL a better fit for real-world data where patterns evolve, as seen in the adaptive seasonal curve and cleaner residuals.<\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">Now that we have completed that step and gotten a feel for what STL does, we will dive into how it figures out the trend and seasonal patterns behind the scenes.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">To better understand how STL decomposition works, we will consider a sample from our dataset spanning from January 2010 to December 2023.<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/06\/stl-sample-data.png\" alt=\"\" class=\"wp-image-607064\"\/><figcaption class=\"wp-element-caption\"><strong>Table: Sample of monthly retail sales data from January 2010 to December 2023 used to demonstrate STL decomposition.<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">To understand how STL decomposition works on this data, we first need rough estimates of the trend and seasonality. 
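Carving that 2010 to 2023 sample out of the full series is a single label-based slice on the DatetimeIndex. A minimal sketch follows; the series here is a synthetic stand-in, since the FRED CSV itself is not reproduced in the post:

```python
import pandas as pd

# Synthetic stand-in for the full Retail_Sales series (Jan 1992 - Mar 2025)
dates = pd.date_range("1992-01-01", "2025-03-01", freq="MS")
full_series = pd.Series(range(len(dates)), index=dates, name="Retail_Sales")

# .loc with partial date strings slices inclusively on a DatetimeIndex,
# so this keeps everything from Jan 2010 through Dec 2023
sample = full_series.loc["2010-01":"2023-12"]
print(len(sample))  # 168 months = 14 years x 12
```

The same slice works unchanged on the real series once it has a monthly DatetimeIndex.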
<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Since STL is a smoothing-based technique, it requires an initial idea of what should be smoothed, such as where the trend lies and how the seasonal patterns behave.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">We will begin by visualizing the retail\u2010sales series (Jan 2010\u2013Dec 2023) and use Python\u2019s STL routine to extract its trend, seasonal, and remainder parts.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\"><strong>Code:<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-prismatic-blocks\"><code class=\"language-python\">import pandas as pd\nimport matplotlib.pyplot as plt\nfrom statsmodels.tsa.seasonal import STL\n\n# Load the dataset\ndf = pd.read_csv(&quot;C:\/STL sample data.csv&quot;, parse_dates=[&#039;Observation_Date&#039;], dayfirst=True)\ndf.set_index(&#039;Observation_Date&#039;, inplace=True)\ndf = df.asfreq(&#039;MS&#039;)  # Ensure monthly frequency\n\n# Extract the time series\nseries = df[&#039;Retail_Sales&#039;]\n\n# Apply STL decomposition\nstl = STL(series, seasonal=13)\nresult = stl.fit()\n\n# Plot and save STL components\nfig, axs = plt.subplots(4, 1, figsize=(10, 8), sharex=True)\n\naxs[0].plot(result.observed, color=&#039;sienna&#039;)\naxs[0].set_title(&#039;Observed&#039;)\n\naxs[1].plot(result.trend, color=&#039;goldenrod&#039;)\naxs[1].set_title(&#039;Trend&#039;)\n\naxs[2].plot(result.seasonal, color=&#039;darkslategrey&#039;)\naxs[2].set_title(&#039;Seasonal&#039;)\n\naxs[3].plot(result.resid, color=&#039;rebeccapurple&#039;)\naxs[3].set_title(&#039;Residual&#039;)\n\nplt.suptitle(&#039;STL Decomposition of Retail Sales(2010-2023)&#039;, fontsize=16)\nplt.tight_layout()\nplt.show()\n<\/code><\/pre>\n\n\n\n<p class=\"wp-block-paragraph\"><strong>Plot:<\/strong><\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/STL_decomposition_plot1.png\" alt=\"\" 
class=\"wp-image-607241\"\/><figcaption class=\"wp-element-caption\"><strong>STL Decomposition of Retail Sales (2010\u20132023)<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">To understand how STL derives its components, we first estimate the data\u2019s long-term trend using a centered moving average.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">We will use a single-month example to demonstrate how to calculate a centered moving average.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">We will calculate the centered moving average for <strong>July 2010<\/strong>.<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/trend-for-july-2010-data-table.png\" alt=\"\" class=\"wp-image-607248\"\/><figcaption class=\"wp-element-caption\"><strong>Monthly Retail Sales from Jan 2010 to Jan 2011<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">Because our data is monthly, the natural cycle covers twelve points, which is an even number. Averaging January 2010 through December 2010 produces a value that falls halfway between June and July. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">To adjust for this, we form a second window from February 2010 through January 2011, whose twelve-month mean lies halfway between July and August. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">We then compute each window\u2019s simple average and average those two results. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">In the first window July is the seventh of twelve points, so the mean lands between months six and seven. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">In the second window July is the sixth of twelve points, so its mean also falls between months six and seven but shifted forward. 
<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Averaging both estimates pulls the result back onto July 2010 itself, yielding a true centered moving average for that month.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/trend-for-july-2010-data-table-3.png\" alt=\"\" class=\"wp-image-607254\"\/><figcaption class=\"wp-element-caption\"><strong>Computation of the Two 12-Month Averages for July 2010<\/strong><\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/trend-for-july-2010-data-table-4.png\" alt=\"\" class=\"wp-image-607255\"\/><figcaption class=\"wp-element-caption\"><strong>Centering the Moving Average on July 2010<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">This is how we compute the initial trend using a centered moving average.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">At the very start and end of our series, we simply don\u2019t have six months on both sides to average\u2014so there\u2019s no \u201cnatural\u201d centered MA for Jan\u2013Jun 2010 or for Jul\u2013Dec 2023. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Rather than drop those points, we carry the first real July 2010 value backwards to fill Jan\u2013Jun, and carry our last valid December 2023 value forward to fill Jul\u2013Dec 2023. 
<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">That way, every month has a baseline trend before we move on to the LOESS refinements.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Next, we will use Python to compute the initial trend for each month.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\"><strong>Code:<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-prismatic-blocks\"><code class=\"language-python\">import pandas as pd\n\n# Load and prepare the data\ndf = pd.read_csv(&quot;C:\/STL sample data for part 3.csv&quot;,\n                 parse_dates=[&quot;Observation_Date&quot;], dayfirst=True,\n                 index_col=&quot;Observation_Date&quot;)\ndf = df.asfreq(&quot;MS&quot;)  # ensure a continuous monthly index\n\n# Extract the series\nsales = df[&quot;Retail_Sales&quot;]\n\n# Compute the two 12-month moving averages\nn = 12\nma1 = sales.rolling(window=n, center=False).mean().shift(-n\/\/2 + 1)\nma2 = sales.rolling(window=n, center=False).mean().shift(-n\/\/2)\n\n# Center them by averaging\nT0 = (ma1 + ma2) \/ 2\n\n# Fill the edges so every month has a value\nT0 = T0.bfill().ffill()\n\n# Attach to the DataFrame\ndf[&quot;Initial_Trend&quot;] = T0<\/code><\/pre>\n\n\n\n<p class=\"wp-block-paragraph\"> <strong>Table:<\/strong><\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/trend-for-july-2010-data-table-5.png\" alt=\"\" class=\"wp-image-607261\"\/><figcaption class=\"wp-element-caption\"><strong>Table: Retail Sales with Initial Centered Moving-Average Trend<\/strong><\/figcaption><\/figure>\n\n\n\n<hr class=\"wp-block-separator has-text-color has-text-primary-color has-alpha-channel-opacity has-text-primary-background-color has-background is-style-dotted\"\/>\n\n\n\n<p class=\"wp-block-paragraph\">Now that we have extracted the initial trend using a centered moving average, let\u2019s see how it actually 
looks. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">We will plot it along with the original time series and STL\u2019s final trend line to compare how each one captures the overall movement in the data.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\"><strong>Plot:<\/strong><\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/obs-vs-ini-vs-stl-trend-1-1024x481.png\" alt=\"\" class=\"wp-image-607334\"\/><figcaption class=\"wp-element-caption\"><strong>Observed Sales vs. Initial 12-month Moving Average Trend vs. Final STL Trend<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">Looking at the plot, we can see that the trend line from the moving average almost overlaps with the STL trend for most of the years. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">But around Jan\u2013Feb 2020, there&#8217;s a sharp dip in the moving average line. This drop was due to the sudden impact of COVID on sales. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">STL handles this better: it doesn\u2019t treat the dip as a long-term trend change but instead assigns it to the residual. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">That\u2019s because STL sees this as a one-time unexpected event, not a repeating seasonal pattern or a shift in the overall trend.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">To understand how STL does this and how it handles changing seasonality, let\u2019s continue building our understanding step by step. 
<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">We now have the initial trend using moving averages, so let\u2019s move on to the next step in the STL process.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-dotted\"\/>\n\n\n\n<p class=\"wp-block-paragraph\">Next, we subtract our centered MA trend from the original sales to obtain the detrended series.<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/detrended-blog-part-3.png\" alt=\"\" class=\"wp-image-607344\"\/><figcaption class=\"wp-element-caption\"><strong>Table: Actual Sales, Initial MA Trend and Detrended Values<\/strong><\/figcaption><\/figure>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-dotted\"\/>\n\n\n\n<p class=\"wp-block-paragraph\">We have removed the long-term trend from our data, so the remaining series shows just the repeating seasonal swings and random noise. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Let\u2019s plot it to see the regular ups and downs and any unexpected spikes or dips.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/detrended-series-1024x314.png\" alt=\"\" class=\"wp-image-607445\"\/><figcaption class=\"wp-element-caption\"><strong>Detrended Series Showing Seasonal Patterns and Irregular Spikes\/Dips<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">The above plot shows what remains after we remove the long-term trend. You can see the familiar annual rise and fall and that deep drop in January 2020 when COVID hit. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">When we average all the January values, including the 2020 crash, that single event blends in and hardly affects the January average. 
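The subtraction behind the detrended series is one line of pandas. A minimal sketch with made-up numbers (the real Retail_Sales and Initial_Trend values come from the tables above):

```python
import pandas as pd

# Made-up sales and trend values; in the article these come from the
# Retail_Sales column and the centered-moving-average trend
idx = pd.date_range("2010-01-01", periods=6, freq="MS")
df = pd.DataFrame({
    "Retail_Sales":  [14000.0, 13500.0, 15200.0, 15000.0, 15800.0, 15300.0],
    "Initial_Trend": [15000.0, 15010.0, 15020.0, 15030.0, 15040.0, 15050.0],
}, index=idx)

# Detrended = observed minus trend; what remains is seasonality plus noise
df["Detrended"] = df["Retail_Sales"] - df["Initial_Trend"]
```

Each detrended value says how far that month sits above or below the local trend level.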
<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">This helps us ignore rare shocks and focus on the true seasonal pattern. Now we will group the detrended values by month and take their averages to create our first seasonal estimate.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">This gives us a stable estimate of seasonality, which STL will then refine and smooth in later iterations to capture any gradual shifts over time.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-dotted\"\/>\n\n\n\n<p class=\"wp-block-paragraph\">Next, we will repeat our seasonal-decompose approach: we\u2019ll group the detrended values by calendar month to extract the raw monthly seasonal offsets.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Let\u2019s focus on January and gather all the detrended values for that month.<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/detrended-blog-part-3-1.png\" alt=\"\" class=\"wp-image-607345\"\/><figcaption class=\"wp-element-caption\"><strong>Table: Detrended January Values (2010\u20132023)<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">Now, we compute the <strong>average of the detrended values for January<\/strong> across all years to obtain a rough seasonal estimate for that month.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/CodeCogsEqn-43.png\" alt=\"\" class=\"wp-image-607426\"\/><figcaption class=\"wp-element-caption\"><strong>Calculating the average of January\u2019s detrended values across 12 years to obtain the seasonal estimate for January.<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">This process is repeated for all 12 months to form the initial seasonal component.<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter 
<img decoding=">
size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/detrended-by-nonth.png\" alt=\"\" class=\"wp-image-607428\"\/><figcaption class=\"wp-element-caption\"><strong>Table: Monthly average of detrended values, forming the seasonal estimate for each month.<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">Now that we have the average detrended values for each month, we map them across the entire time series to construct the initial seasonal component.<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/detrended-by-nonth-1.png\" alt=\"\" class=\"wp-image-607429\"\/><figcaption class=\"wp-element-caption\"><strong>Table: Detrended values and their monthly averages used for estimating the seasonal pattern.<\/strong><\/figcaption><\/figure>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-dotted\"\/>\n\n\n\n<p class=\"wp-block-paragraph\">After grouping the detrended values by month and calculating their averages, we obtain a new series of monthly means. Let\u2019s plot this series to observe how the data look after applying this averaging step.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/seasonal-estimate-monthly-averages-1024x306.png\" alt=\"\" class=\"wp-image-607446\"\/><figcaption class=\"wp-element-caption\"><strong>Seasonal estimate by repeating monthly averages.<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">In the above plot, we grouped the detrended values by month and took the average for each one. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">This helped us reduce the effect of that big dip in January 2020, which was likely due to the COVID pandemic. 
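<img decoding=">
The group-by-month averaging and the mapping back across the series can be sketched as follows; the toy numbers are illustrative, not the article's data:

```python
import pandas as pd

# Toy detrended series over two years; the real values come from the
# detrended table above
idx = pd.date_range("2010-01-01", periods=24, freq="MS")
detrended = pd.Series(range(24), index=idx, dtype=float)

# Initial seasonal estimate: average the detrended values month by month
monthly_means = detrended.groupby(detrended.index.month).mean()

# Broadcast each month's average back across the whole series, so every
# January gets January's mean, every February gets February's mean, ...
initial_seasonal = detrended.groupby(detrended.index.month).transform("mean")
```

`transform("mean")` returns a series aligned with the original index, which is exactly the "repeating monthly averages" curve plotted above.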
<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">By averaging all the January values together, that sudden drop gets blended in with the rest, giving us a more stable picture of how January usually behaves each year.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">However, if we look closely, we can still see some sudden spikes and dips in the line. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">These might be caused by things like special promotions, strikes or unexpected holidays that don\u2019t happen every year. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Since seasonality is meant to capture patterns that repeat regularly each year, we don\u2019t want these irregular events to stay in the seasonal curve.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">But how do we know those spikes or dips are just one-off events and not real seasonal patterns? It comes down to how often they happen. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">A big spike in December shows up because every December has high sales, so the December average stays high year after year. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">We see a small increase in March, but that\u2019s mostly because one or two years were unusually strong.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">The average for March doesn\u2019t really shift much. When a pattern shows up almost every year in the same month, that\u2019s seasonality. If it only happens once or twice, it\u2019s probably just an irregular event.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">To handle this, we use a low-pass filter. While averaging helps us get a rough idea of seasonality, the low-pass filter goes one step further.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\"> It smooths out those remaining small spikes and dips so that we are left with a clean seasonal pattern that reflects the general rhythm of the year. 
<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">This smooth seasonal curve will then be used in the next steps of the STL process.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-dotted\"\/>\n\n\n\n<p class=\"wp-block-paragraph\">Next, we will smooth out that rough seasonal curve by running a low-pass filter over every point in our monthly-average series.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">To apply the low-pass filter, we start by computing a centered 13-month moving average.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">For example, consider September 2010. The 13-month average at this point (from March 2010 to March 2011) would be:<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/CodeCogsEqn-44-1024x116.png\" alt=\"\" class=\"wp-image-607490\"\/><figcaption class=\"wp-element-caption\"><strong>13-Month Average Example for September 2010 using surrounding monthly seasonal values<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">We repeat this 13-month averaging for every point in our monthly average series. Because the pattern repeats every year, the value for September 2010 will be the same as for September 2011. 
<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">For the first and last six months, we don\u2019t have enough data to take a full 13-month average, so we just use whatever months are available around them.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Let\u2019s take a look at the averaging windows used for the months where a full 13-month average isn\u2019t possible.<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/Averaging-Window-Used.png\" alt=\"\" class=\"wp-image-607491\"\/><figcaption class=\"wp-element-caption\"><strong>Table:<\/strong> <strong>Averaging windows used for the first and last six months, where a full 13-month average was not possible.<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">Now we\u2019ll use Python to calculate the 13-month average values.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\"><strong>Code:<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-prismatic-blocks\"><code class=\"language-python\">import pandas as pd\n\n# Load the seasonal estimate series\ndf = pd.read_csv(&quot;C:\/stl_with_monthly_avg.csv&quot;, parse_dates=[&#039;Observation_Date&#039;], dayfirst=True)\n\n# Apply 13-month centered moving average on the Avg_Detrended_by_Month column\n# Handle the first and last 6 values with partial windows\nseasonal_estimate = df[[&#039;Observation_Date&#039;, &#039;Avg_Detrended_by_Month&#039;]].copy()\nlpf_values = []\n\nfor i in range(len(seasonal_estimate)):\n    start = max(0, i - 6)\n    end = min(len(seasonal_estimate), i + 7)  # non-inclusive\n    window_avg = seasonal_estimate[&#039;Avg_Detrended_by_Month&#039;].iloc[start:end].mean()\n    lpf_values.append(window_avg)\n\n# Add the result to DataFrame\nseasonal_estimate[&#039;LPF_13_Month&#039;] = lpf_values<\/code><\/pre>\n\n\n\n<p class=\"wp-block-paragraph\">With this code, we get the 13-month moving average for the full time 
series.<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/Averaging-Window-Used-1.png\" alt=\"\" class=\"wp-image-607492\"\/><figcaption class=\"wp-element-caption\"><strong>Table: Monthly detrended values along with their smoothed 13-month averages.<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">After completing the first step of applying the low-pass filter by calculating the 13-month averages, the next step is to smooth those results further using a 3-point moving average.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Now, let\u2019s see how the 3-point average is calculated for September 2010.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/CodeCogsEqn-45.png\" alt=\"\" class=\"wp-image-607493\"\/><figcaption class=\"wp-element-caption\"><strong>Step-by-step calculation of the 3-point moving average for September 2010 as part of the low-pass filtering process.<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">For January 2010, we calculate the average using January and February values, and for December 2023, we use December and November. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">This approach is used for the endpoints where a full 3-month window isn\u2019t available. 
In this way, we compute the 3-point moving average for each data point in the series.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Now, we use Python again to calculate the 3-month window averages for our data.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\"><strong>Code:<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-prismatic-blocks\"><code class=\"language-python\">import pandas as pd\n\n# Load CSV file\ndf = pd.read_csv(&quot;C:\/seasonal_13month_avg3.csv&quot;, parse_dates=[&#039;Observation_Date&#039;], dayfirst=True)\n\n\n# Calculate the 3-point moving average\nlpf_values = df[&#039;LPF_13_Month&#039;].values\nmoving_avg_3 = []\n\nfor i in range(len(lpf_values)):\n    if i == 0:\n        avg = (lpf_values[i] + lpf_values[i + 1]) \/ 2\n    elif i == len(lpf_values) - 1:\n        avg = (lpf_values[i - 1] + lpf_values[i]) \/ 2\n    else:\n        avg = (lpf_values[i - 1] + lpf_values[i] + lpf_values[i + 1]) \/ 3\n    moving_avg_3.append(avg)\n\n# Add the result to a new column\ndf[&#039;LPF_13_3&#039;] = moving_avg_3<\/code><\/pre>\n\n\n\n<p class=\"wp-block-paragraph\">Using the code above, we get the 3-month moving average values.<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/first-3-month-average.png\" alt=\"\" class=\"wp-image-607494\"\/><figcaption class=\"wp-element-caption\"><strong>Table: Applying the second step of the low-pass filter: 3-month averages on the 13-month smoothed values.<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">We\u2019ve calculated the 3-month averages on the 13-month smoothed values. 
Next, we\u2019ll apply another 3-month moving average to further refine the series.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\"><strong>Code:<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-prismatic-blocks\"><code class=\"language-python\">import pandas as pd\n\n# Load the dataset\ndf = pd.read_csv(&quot;C:\/5seasonal_lpf_13_3_1.csv&quot;)\n\n# Apply 3-month moving average on the existing LPF_13_3 column\nlpf_column = &#039;LPF_13_3&#039;\nsmoothed_column = &#039;LPF_13_3_3&#039;\n\nsmoothed_values = []\nfor i in range(len(df)):\n    if i == 0:\n        avg = df[lpf_column].iloc[i:i+2].mean()\n    elif i == len(df) - 1:\n        avg = df[lpf_column].iloc[i-1:i+1].mean()\n    else:\n        avg = df[lpf_column].iloc[i-1:i+2].mean()\n    smoothed_values.append(avg)\n\n# Add the new smoothed column to the DataFrame\ndf[smoothed_column] = smoothed_values<\/code><\/pre>\n\n\n\n<p class=\"wp-block-paragraph\">From the above code, we have now calculated the 3-month averages once again.<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/first-3-month-average-1.png\" alt=\"\" class=\"wp-image-607496\"\/><figcaption class=\"wp-element-caption\"><strong>Table: Final step of the low-pass filter: Second 3-month moving average applied on previously smoothed values to reduce noise and stabilize seasonal pattern.<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">With all three levels of smoothing complete, the next step is to calculate a weighted average at each point to obtain the final low-pass filtered seasonal curve.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">It&#8217;s like taking an average, but a smarter one. We use three versions of the seasonal pattern, each smoothed to a different level. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Here is how each of the three versions is built, from lightest to smoothest. 
<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">The first is a simple 13-month moving average, which applies light smoothing. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">The second takes this result and applies a 3-month moving average, making it smoother. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">The third repeats this step, resulting in the most stable version. Since the third one is the most reliable, we give it the most weight. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">The first version still contributes a little, and the second plays a moderate role. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">By combining them with weights of 1, 3, and 9, we calculate a weighted average that gives us the final seasonal estimate.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">This result is smooth and steady, yet flexible enough to follow real changes in the data.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Here\u2019s how we calculate the weighted average at each point.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">For example, let\u2019s take September 2010.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/CodeCogsEqn-46.png\" alt=\"\" class=\"wp-image-607503\"\/><figcaption class=\"wp-element-caption\"><strong>Final LPF calculation for September 2010 using weighted smoothing. 
The three smoothed values are combined using weights 1, 3, and 9, then averaged to get the final seasonal estimate.<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">We divide by 23 to apply an additional shrink factor and ensure the weighted average stays on the same scale.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\"><strong>Code:<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-prismatic-blocks\"><code class=\"language-python\">import pandas as pd\n\n# Load the dataset\ndf = pd.read_csv(&quot;C:\/7seasonal_lpf_13_3_2.csv&quot;)\n\n# Calculate the weighted average using 1:3:9 across LPF_13_Month, LPF_13_3, and LPF_13_3_2\ndf[&quot;Final_LPF&quot;] = (\n    1 * df[&quot;LPF_13_Month&quot;] +\n    3 * df[&quot;LPF_13_3&quot;] +\n    9 * df[&quot;LPF_13_3_2&quot;]\n) \/ 23<\/code><\/pre>\n\n\n\n<p class=\"wp-block-paragraph\">By using the code above, we calculate the weighted average at each point in the series.<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/final-lpf-values.png\" alt=\"\" class=\"wp-image-607507\"\/><figcaption class=\"wp-element-caption\"><strong>Table: Final LPF values at each time point, computed using weighted smoothing with 1:3:9 weights. The last column shows the final seasonal estimate, derived from three levels of low-pass filtering.<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">These final smoothed values represent the seasonal pattern in the data. 
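<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">For reference, the two 3-point passes and the 1:3:9 blend can be wrapped into a single helper. This is only a sketch: it starts from a toy series of 13-month averages and uses pandas rolling windows in place of the explicit loops above:<\/p>

```python
import pandas as pd

def low_pass_blend(lpf13: pd.Series) -> pd.Series:
    """Blend three smoothing levels of a 13-month-averaged series
    with weights 1:3:9 and the divisor of 23 used in the text."""
    ma3 = lambda s: s.rolling(3, center=True, min_periods=1).mean()
    lpf13_3 = ma3(lpf13)        # first 3-point pass
    lpf13_3_3 = ma3(lpf13_3)    # second 3-point pass
    return (1 * lpf13 + 3 * lpf13_3 + 9 * lpf13_3_3) / 23

# Toy 13-month-averaged values, not taken from the dataset
final_lpf = low_pass_blend(pd.Series([-60.0, -75.0, -70.0, -68.0, -72.0]))
```

<p class=\"wp-block-paragraph\">Running the real LPF_13_Month column through such a helper should reproduce the final smoothed values shown in the table.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">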
They highlight the recurring monthly fluctuations, free from random noise or outliers, and provide a clearer view of the underlying seasonal rhythms over time.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-dotted\"\/>\n\n\n\n<p class=\"wp-block-paragraph\">But before moving to the next step, it&#8217;s important to understand why we used a 13-month average followed by two rounds of 3-month averaging as part of the low-pass filtering process.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">First, we calculated the average of detrended values by grouping them by month. This gave us a rough idea of the seasonal pattern. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">But as we saw earlier, this pattern still has some random spikes and dips. Since we\u2019re working with monthly data, it might seem like using a 12-month average would make sense. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">But STL actually uses a 13-month average. That\u2019s because 12 is an even number, so the average isn\u2019t centered on a single month \u2014 it falls between two months. This can slightly shift the pattern. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Using 13, which is an odd number, keeps the smoothing centered right on each month. It helps us smooth out the noise while keeping the true seasonal pattern in place. 
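<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">A quick numeric check makes the centering argument concrete. Here the positions 0, 1, 2, \u2026 stand in for months:<\/p>

```python
import pandas as pd

s = pd.Series(range(20), dtype=float)

# A centered 13-point window at position i spans i-6 .. i+6,
# symmetric around i, so its average lands exactly on position i
m13 = s.rolling(13, center=True).mean()

# A centered 12-point window cannot be symmetric: its average
# sits half a step away from position i
m12 = s.rolling(12, center=True).mean()
```

<p class=\"wp-block-paragraph\">With the odd window, the value at position 10 is exactly 10; with the even window, it is off by half a step.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">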
<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Let\u2019s take a look at how the 13-month average transforms the series with the help of a plot.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/13-month-plot-1024x508.png\" alt=\"\" class=\"wp-image-607511\"\/><figcaption class=\"wp-element-caption\"><strong>Smoothing Monthly Averages Using a 13-Month Moving Average<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">The orange line, representing the 13-month average, smooths the sharp fluctuations seen in the raw monthly averages (blue), helping to expose a clearer and more consistent seasonal pattern by filtering out random noise.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-dotted\"\/>\n\n\n\n<p class=\"wp-block-paragraph\">You might notice that the peaks in the orange line don\u2019t perfectly line up with the blue ones anymore. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">For example, a spike that appeared in December before might now show up slightly earlier or later. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">This happens because the 13-month average looks at the surrounding values, which can shift the curve a little to the side.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">This shifting is a normal effect of moving averages. To fix it, the next step is centering. 
<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">We group the smoothed values by calendar month and putting all the January values together and so on and then take the average.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">This brings the seasonal pattern back into alignment with the correct month, so it reflects the real timing of the seasonality in the data.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-dotted\"\/>\n\n\n\n<p class=\"wp-block-paragraph\">After smoothing the pattern with a 13-month average, the curve looks much cleaner, but it can still have small spikes and dips. To smooth it a little more, we use a 3-month average. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">But why <strong>3<\/strong> and not something bigger like 5 or 6. A 3-month window works well because it smooths gently without making the curve too flat. If we use a larger window, we might lose the natural shape of the seasonality. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Using a smaller window like 3, and applying it twice, gives a nice balance between cleaning the noise and keeping the real pattern.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Now let\u2019s see what this looks like on a plot.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/13-3-3-plots-1024x508.png\" alt=\"\" class=\"wp-image-607520\"\/><figcaption class=\"wp-element-caption\"><strong>Progressive Smoothing of Seasonal Pattern \u2014 Starting with a 13-Month Average and Applying Two 3-Month Averages for Refinement<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">This plot shows how our rough seasonal estimate becomes smoother in steps. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">The blue line is the result of the 13-month average, which already softens out many of the random spikes. 
<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Then we apply a 3-month average once (orange line) and again (green line). Each step smooths the curve a bit more, especially removing tiny bumps and jagged noise. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">By the end, we get a clean seasonal shape that still follows the repeating pattern but is much more stable and easier to work with for forecasting.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">We now have three versions of the seasonal pattern: one slightly rough, one moderately smooth and one very smooth. It might seem like we could simply choose the smoothest one and move on.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\"> After all, seasonality repeats every year, so the cleanest curve should be enough. But in real-world data, seasonal behavior is rarely that perfect. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">December spikes might show up a little earlier in some years, or their size might vary depending on other factors. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">The rough version captures these small shifts, but it also carries noise. The smoothest one removes the noise but can miss those subtle variations. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">That\u2019s why STL blends all three. It gives more weight to the smoothest version because it is the most stable, but it also keeps some influence from the medium and rougher ones to retain flexibility. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">This way, the final seasonal curve is clean and reliable, yet still responsive to natural changes. As a result, the trend we extract in later steps stays true and does not absorb leftover seasonal effects.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">We use weights of 1, 3, and 9 when blending the three seasonal curves because each version gives us a different perspective. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">The roughest version picks up small shifts and short-term changes but also includes a lot of noise. 
The medium version balances detail and stability, while the smoothest version gives a clean and steady seasonal shape that we can trust the most. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">That is why we give the smoothest one the highest weight. These specific weights are recommended in the original STL paper because they work well in most real-world cases. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">We might wonder why we don\u2019t use something like 1, 4, and 16 instead. While that would give even more importance to the smoothest curve, it could also make the seasonal pattern too rigid and less responsive to natural shifts in timing or intensity. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Real-life seasonality is not always perfect. A spike that usually happens in December might come earlier in some years. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">The 1, 3, 9 combination helps us stay flexible while still keeping things smooth. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">After blending the three seasonal curves using weights of 1, 3, and 9, we might expect to divide the result by 13, the sum of the weights, as we would in a regular weighted average. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">But here we divide it by 23 (13+10) instead. 
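<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">To see what the larger divisor does, compare it with dividing by the weight sum of 13. The numbers below are made up for illustration:<\/p>

```python
# Illustrative smoothed values for one month (not from the dataset)
rough, medium, smooth = -80.0, -72.0, -70.0

blend = 1 * rough + 3 * medium + 9 * smooth   # weighted sum, weights 1:3:9

weighted_mean = blend / 13   # an ordinary weighted average
shrunk = blend / 23          # the divisor used here

# Dividing by 23 instead of 13 scales every value by the
# same constant factor, 13/23 (about 0.565)
ratio = shrunk / weighted_mean
```

<p class=\"wp-block-paragraph\">The proportions between the three versions are unchanged; only the overall amplitude of the seasonal curve is damped.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">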
This scaling factor gently shrinks the seasonal values, especially at the edges of the series where estimates tend to be less stable.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">It also helps keep the seasonal pattern reasonably scaled, so it doesn&#8217;t overpower the trend or distort the overall structure of the time series.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">The result is a seasonal curve that is smooth, adaptive, and does not interfere with the trend.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Now let\u2019s plot the final low-pass filtered values that we obtained by calculating the weighted averages.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/final-lpf-1024x508.png\" alt=\"\" class=\"wp-image-607523\"\/><figcaption class=\"wp-element-caption\"><strong><strong>Final Low-Pass Filtered Seasonal Component<\/strong><\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">This plot shows the final seasonal pattern we got by blending three smoothed versions using weights 1, 3, and 9. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">The result keeps the repeating monthly pattern clear while reducing random spikes. It\u2019s now ready to be centered and subtracted from the data to find the trend.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-dotted\"\/>\n\n\n\n<p class=\"wp-block-paragraph\">The final low-pass filtered seasonal component is ready. The next step is to center it to ensure the seasonal effects average to zero over each cycle.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">We center the seasonal values by making their average (mean) zero. This is important because the seasonal part should only show repeating patterns, like regular ups and downs each year, not any overall increase or decrease. 
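<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">In code, this centering amounts to grouping by calendar month and subtracting each month\u2019s mean. Here is a sketch with invented values; the column names follow the earlier snippets:<\/p>

```python
import pandas as pd

# Two years of toy Final_LPF values: a seasonal shape plus a small drift
df = pd.DataFrame({
    "Observation_Date": pd.date_range("2010-01-01", periods=24, freq="MS"),
    "Final_LPF": [(i % 12) * 10.0 + i for i in range(24)],
})

# Average of each calendar month across all years...
month = df["Observation_Date"].dt.month
monthly_mean = df.groupby(month)["Final_LPF"].transform("mean")

# ...subtracted from each value, so the centered series averages to zero
df["Centered_Seasonal"] = df["Final_LPF"] - monthly_mean
```

<p class=\"wp-block-paragraph\">After this step, every calendar month\u2019s centered values average to zero.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">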
<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">If the average isn\u2019t zero, the seasonal part might wrongly include part of the trend. By setting the mean to zero, we make sure the trend shows the long-term movement, and the seasonal part just shows the repeating changes.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">To perform the centering, we first group the final low-pass filter seasonal components by month and then calculate the average.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">After calculating the average, we subtract it from the actual final low-pass filtered value. This gives us the centered seasonal component, completing the centering step.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Let\u2019s walk through how centering is done for a single data point.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">For Sep 2010<br>Final LPF value (September 2010) = \u221271.30<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Monthly average of all September LPF values = \u221248.24<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Centered seasonal value = Final LPF \u2013 Monthly Average <br>= \u221271.30\u2212(\u221248.24) = \u221223.06<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">In this way, we calculate the centered seasonal component for each data point in the series.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/final-lpf-values-1.png\" alt=\"\" class=\"wp-image-607531\"\/><figcaption class=\"wp-element-caption\"><strong>Table: Centering the Final Low-Pass Filtered Seasonal Values<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">Now we\u2019ll plot these values to see how the centered seasonality curve looks.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\"><strong>Plot:<\/strong><\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img decoding=\"async\" 
src=\"https:\/\/contributor.insightmediagroup.io\/wp-content\/uploads\/2025\/07\/centered-seasonal-component.png\" alt=\"\" class=\"wp-image-607532\"\/><figcaption class=\"wp-element-caption\"><strong>Centered Seasonal Component vs. Monthly Detrended Averages<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"wp-block-paragraph\">The above plot compares the <strong>monthly average of detrended values<\/strong> (blue line) with the <strong>centered seasonal component<\/strong> (orange line) obtained after low-pass filtering and centering.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">We can observe that the orange curve is much smoother and cleaner, capturing the <strong>repeating seasonal pattern<\/strong> without any long-term drift. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">This is because we\u2019ve centered the seasonal component by subtracting the monthly average, ensuring its mean is zero across each cycle.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Importantly, we can also see that the <strong>spikes in the seasonal pattern now align with their original positions<\/strong>. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">The peaks and dips in the orange line match the timing of the blue spikes, showing that the seasonal effect has been properly estimated and re-aligned with the data.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-dotted\"\/>\n\n\n\n<p class=\"wp-block-paragraph\">In this part, we discussed how to calculate the initial trend and seasonality in the STL process. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">These initial components are essential because STL is a smoothing-based decomposition method, and it needs a structured starting point to work effectively.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Without an initial estimate of the trend and seasonality, applying LOESS directly to the raw data could lead to the smoothing of noise and residuals or even fitting patterns to random fluctuations. 
This would result in unreliable or misleading components.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">That\u2019s why we first extract a rough trend using moving averages and then isolate seasonality using a low-pass filter. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">These provide STL with a reasonable approximation to begin its iterative refinement process, which we will explore in the <strong>next part<\/strong>.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">In the <strong>next part<\/strong>, we begin by <strong>deseasonalizing the original series<\/strong> using the centered seasonal component. Then, we apply <strong>LOESS smoothing<\/strong> to the deseasonalized data to obtain an updated trend.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">This marks the starting point of the <strong>iterative refinement process<\/strong> in STL.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\"><strong>Note: All images, unless otherwise noted, are by the author.<\/strong><\/p>\n\n\n\n<p class=\"wp-block-paragraph\"><strong>Dataset:<\/strong> This blog uses publicly available data from FRED (Federal Reserve Economic Data). The series <em>Advance Retail Sales: Department Stores (RSDSELD)<\/em> is published by the U.S. Census Bureau and can be used for analysis and publication with appropriate citation.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">Official citation:<br>U.S. Census Bureau, <em>Advance Retail Sales: Department Stores<\/em> [RSDSELD], retrieved from FRED, Federal Reserve Bank of St. 
Louis; <a class=\"\" href=\"https:\/\/fred.stlouisfed.org\/series\/RSDSELD\">https:\/\/fred.stlouisfed.org\/series\/RSDSELD<\/a>, July 7, 2025.<\/p>\n\n\n\n<p class=\"wp-block-paragraph\"><strong>Thanks for reading!<\/strong><\/p>\n\n\n\n<p class=\"wp-block-paragraph\"><\/p>\n","protected":false},"excerpt":{"rendered":"<p>STL Decomposition excels when seasonal patterns evolve over time.<\/p>\n","protected":false},"author":18,"featured_media":606535,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"is_member_only":false,"sub_heading":"STL Decomposition excels when seasonal patterns evolve over time.","footnotes":""},"categories":[44],"tags":[468,446,467,2180,458],"sponsor":[],"coauthors":[32566],"class_list":["post-606534","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-data-science","tag-deep-dives","tag-machine-learning","tag-python","tag-time-series","tag-time-series-forecasting"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v25.2 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Time Series Forecasting Made Simple (Part 3.1): STL Decomposition | Towards Data Science<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/towardsdatascience.com\/time-series-forecasting-made-simple-part-3-1-stl-decomposition-understanding-initial-trend-and-seasonality-prior-to-loess-smoothing\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Time Series Forecasting Made Simple (Part 3.1): STL Decomposition | Towards Data Science\" \/>\n<meta property=\"og:description\" content=\"STL Decomposition excels when seasonal patterns evolve over time.\" \/>\n<meta property=\"og:url\" 
content=\"https:\/\/towardsdatascience.com\/time-series-forecasting-made-simple-part-3-1-stl-decomposition-understanding-initial-trend-and-seasonality-prior-to-loess-smoothing\/\" \/>\n<meta property=\"og:site_name\" content=\"Towards Data Science\" \/>\n<meta property=\"article:published_time\" content=\"2025-07-09T19:06:11+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-07-09T19:06:26+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/towardsdatascience.com\/wp-content\/uploads\/2025\/07\/lucian-alexe-f2xfTOv0p9Y-unsplash-scaled-1.jpeg\" \/>\n\t<meta property=\"og:image:width\" content=\"2560\" \/>\n\t<meta property=\"og:image:height\" content=\"1771\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Nikhil Dasari\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@TDataScience\" \/>\n<meta name=\"twitter:site\" content=\"@TDataScience\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Nikhil Dasari\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"25 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/towardsdatascience.com\/time-series-forecasting-made-simple-part-3-1-stl-decomposition-understanding-initial-trend-and-seasonality-prior-to-loess-smoothing\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/towardsdatascience.com\/time-series-forecasting-made-simple-part-3-1-stl-decomposition-understanding-initial-trend-and-seasonality-prior-to-loess-smoothing\/\"},\"author\":{\"name\":\"TDS Editors\",\"@id\":\"https:\/\/towardsdatascience.com\/#\/schema\/person\/f9925d336b6fe962b03ad8281d90b8ee\"},\"headline\":\"Time Series Forecasting Made Simple (Part 3.1): STL Decomposition\",\"datePublished\":\"2025-07-09T19:06:11+00:00\",\"dateModified\":\"2025-07-09T19:06:26+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/towardsdatascience.com\/time-series-forecasting-made-simple-part-3-1-stl-decomposition-understanding-initial-trend-and-seasonality-prior-to-loess-smoothing\/\"},\"wordCount\":4165,\"publisher\":{\"@id\":\"https:\/\/towardsdatascience.com\/#organization\"},\"image\":{\"@id\":\"https:\/\/towardsdatascience.com\/time-series-forecasting-made-simple-part-3-1-stl-decomposition-understanding-initial-trend-and-seasonality-prior-to-loess-smoothing\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/towardsdatascience.com\/wp-content\/uploads\/2025\/07\/lucian-alexe-f2xfTOv0p9Y-unsplash-scaled-1.jpeg\",\"keywords\":[\"Deep Dives\",\"Machine Learning\",\"Python\",\"Time Series\",\"Time Series Forecasting\"],\"articleSection\":[\"Data 
Science\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/towardsdatascience.com\/time-series-forecasting-made-simple-part-3-1-stl-decomposition-understanding-initial-trend-and-seasonality-prior-to-loess-smoothing\/\",\"url\":\"https:\/\/towardsdatascience.com\/time-series-forecasting-made-simple-part-3-1-stl-decomposition-understanding-initial-trend-and-seasonality-prior-to-loess-smoothing\/\",\"name\":\"Time Series Forecasting Made Simple (Part 3.1): STL Decomposition | Towards Data Science\",\"isPartOf\":{\"@id\":\"https:\/\/towardsdatascience.com\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/towardsdatascience.com\/time-series-forecasting-made-simple-part-3-1-stl-decomposition-understanding-initial-trend-and-seasonality-prior-to-loess-smoothing\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/towardsdatascience.com\/time-series-forecasting-made-simple-part-3-1-stl-decomposition-understanding-initial-trend-and-seasonality-prior-to-loess-smoothing\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/towardsdatascience.com\/wp-content\/uploads\/2025\/07\/lucian-alexe-f2xfTOv0p9Y-unsplash-scaled-1.jpeg\",\"datePublished\":\"2025-07-09T19:06:11+00:00\",\"dateModified\":\"2025-07-09T19:06:26+00:00\",\"breadcrumb\":{\"@id\":\"https:\/\/towardsdatascience.com\/time-series-forecasting-made-simple-part-3-1-stl-decomposition-understanding-initial-trend-and-seasonality-prior-to-loess-smoothing\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/towardsdatascience.com\/time-series-forecasting-made-simple-part-3-1-stl-decomposition-understanding-initial-trend-and-seasonality-prior-to-loess-smoothing\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/towardsdatascience.com\/time-series-forecasting-made-simple-part-3-1-stl-decomposition-understanding-initial-trend-and-seasonality-prior-to-loess-smoothing\/#primaryimage\",\"url\":\"https:\/\/towardsdatascience.com\
/wp-content\/uploads\/2025\/07\/lucian-alexe-f2xfTOv0p9Y-unsplash-scaled-1.jpeg\",\"contentUrl\":\"https:\/\/towardsdatascience.com\/wp-content\/uploads\/2025\/07\/lucian-alexe-f2xfTOv0p9Y-unsplash-scaled-1.jpeg\",\"width\":2560,\"height\":1771,\"caption\":\"Photo by Lucian Alexe via Unsplash\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/towardsdatascience.com\/time-series-forecasting-made-simple-part-3-1-stl-decomposition-understanding-initial-trend-and-seasonality-prior-to-loess-smoothing\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/towardsdatascience.com\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Time Series Forecasting Made Simple (Part 3.1): STL Decomposition\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/towardsdatascience.com\/#website\",\"url\":\"https:\/\/towardsdatascience.com\/\",\"name\":\"Towards Data Science\",\"description\":\"Publish AI, ML &amp; data-science insights to a global community of data professionals.\",\"publisher\":{\"@id\":\"https:\/\/towardsdatascience.com\/#organization\"},\"alternateName\":\"TDS\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/towardsdatascience.com\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/towardsdatascience.com\/#organization\",\"name\":\"Towards Data 
Science\",\"alternateName\":\"TDS\",\"url\":\"https:\/\/towardsdatascience.com\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/towardsdatascience.com\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/towardsdatascience.com\/wp-content\/uploads\/2025\/02\/tds-logo.jpg\",\"contentUrl\":\"https:\/\/towardsdatascience.com\/wp-content\/uploads\/2025\/02\/tds-logo.jpg\",\"width\":696,\"height\":696,\"caption\":\"Towards Data Science\"},\"image\":{\"@id\":\"https:\/\/towardsdatascience.com\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/x.com\/TDataScience\",\"https:\/\/www.youtube.com\/c\/TowardsDataScience\",\"https:\/\/www.linkedin.com\/company\/towards-data-science\/\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/towardsdatascience.com\/#\/schema\/person\/f9925d336b6fe962b03ad8281d90b8ee\",\"name\":\"TDS Editors\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/towardsdatascience.com\/#\/schema\/person\/image\/23494c9101089ad44ae88ce9d2f56aac\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/?s=96&d=mm&r=g\",\"caption\":\"TDS Editors\"},\"description\":\"Building a vibrant data science and machine learning community. Share your insights and projects with our global audience: bit.ly\/write-for-tds\",\"url\":\"https:\/\/towardsdatascience.com\/author\/towardsdatascience\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->"}}