Alaskan Seafood Recipes Optimized by Nutrition AI

Last September I pulled a twenty-pound king salmon out of the Susitna River, brought it home to the cabin, and realized I had absolutely no idea what the optimal way to prepare it was from a nutritional standpoint. I knew how to grill it. I knew how to smoke it. I knew my grandmother's recipe for salmon croquettes. What I didn't know was whether any of those methods were actually preserving the omega-3 fatty acids I was supposedly catching this fish for in the first place.

So I did what any software engineer with too much curiosity and a free evening would do. I built something.

The Problem with Cooking by Instinct

I've been living in Caswell Lakes, Alaska, for a while now. Up here, wild-caught seafood isn't a luxury — it's a staple. Salmon, halibut, rockfish, Dungeness crab, spot prawns when you can get them. During the summer months, my freezer looks like a commercial fishing operation exploded inside it.

But here's what nobody talks about: how you prepare seafood dramatically changes its nutritional profile. Deep frying a piece of halibut versus poaching it isn't just a calorie difference. You're looking at changes in bioavailable protein, fat-soluble vitamin retention, omega-3 degradation from heat, and mineral loss from various cooking liquids. The science on this is well-established in food chemistry literature, but nobody has made it accessible to actual home cooks.

I'm 52 years old. I think about things like heart health and inflammation now. When I'm eating wild salmon three times a week, I want to know I'm actually getting the nutritional benefit I think I'm getting.

Traditional recipe sites are useless for this. They'll tell you the salmon is "healthy" and leave it at that. Nutrition databases like USDA FoodData Central give you raw numbers but don't account for cooking method variations. And most meal planning apps treat a piece of salmon the same whether you blackened it in cast iron or gently steamed it over cedar.


Building a Nutrition-Aware Recipe Optimizer

The idea was straightforward: take a set of ingredients (specifically Alaskan seafood), combine them with cooking method data and nutritional degradation models, and use AI to suggest recipes that maximize specific nutritional goals.

I built the first prototype in Node.js over a weekend. The architecture is simple enough that I'm a little embarrassed by it, but it works.

var express = require("express");
var OpenAI = require("openai");

var app = express();
app.use(express.json()); // JSON body parsing is built into Express 4.16+; body-parser is no longer needed

var openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

var nutritionProfiles = {
  "king_salmon": {
    omega3_per_100g: 2.3,
    protein_per_100g: 20.4,
    selenium_mcg: 36.5,
    vitamin_d_iu: 526,
    heat_sensitivity: {
      omega3_loss_per_minute_at_350f: 0.018,
      vitamin_d_loss_per_minute_at_350f: 0.012,
      protein_denaturation_threshold_f: 160,
    },
  },
  "halibut": {
    omega3_per_100g: 0.5,
    protein_per_100g: 22.5,
    selenium_mcg: 45.6,
    vitamin_d_iu: 196,
    heat_sensitivity: {
      omega3_loss_per_minute_at_350f: 0.022,
      vitamin_d_loss_per_minute_at_350f: 0.015,
      protein_denaturation_threshold_f: 155,
    },
  },
};

var cookingMethods = {
  "poach": { avg_temp_f: 180, avg_time_min: 12, fat_added: false },
  "steam": { avg_temp_f: 212, avg_time_min: 10, fat_added: false },
  "grill": { avg_temp_f: 450, avg_time_min: 8, fat_added: false },
  "pan_sear": { avg_temp_f: 400, avg_time_min: 6, fat_added: true },
  "bake": { avg_temp_f: 375, avg_time_min: 18, fat_added: false },
  "deep_fry": { avg_temp_f: 375, avg_time_min: 5, fat_added: true },
  "smoke": { avg_temp_f: 225, avg_time_min: 120, fat_added: false },
  "raw_cure": { avg_temp_f: 38, avg_time_min: 1440, fat_added: false },
};

function estimateNutrientRetention(fish, method) {
  var profile = nutritionProfiles[fish];
  var cooking = cookingMethods[method];
  if (!profile || !cooking) return null;

  var tempFactor = cooking.avg_temp_f / 350; // scale the 350F loss coefficients to this method's temperature
  var omega3Retained =
    1 -
    profile.heat_sensitivity.omega3_loss_per_minute_at_350f *
      tempFactor *
      cooking.avg_time_min;
  var vitDRetained =
    1 -
    profile.heat_sensitivity.vitamin_d_loss_per_minute_at_350f *
      tempFactor *
      cooking.avg_time_min;

  // Floor the retention estimates; even harsh methods don't destroy everything.
  omega3Retained = Math.max(omega3Retained, 0.15);
  vitDRetained = Math.max(vitDRetained, 0.20);

  return {
    omega3_mg: Math.round(profile.omega3_per_100g * 1000 * omega3Retained),
    vitamin_d_iu: Math.round(profile.vitamin_d_iu * vitDRetained),
    protein_g: profile.protein_per_100g * 0.95, // protein is relatively heat-stable; assume ~5% loss
    selenium_mcg: profile.selenium_mcg * 0.88, // minerals leach into cooking liquids; assume ~12% loss
    method: method,
    retention_score: Math.round((omega3Retained + vitDRetained) / 2 * 100),
  };
}

The estimateNutrientRetention function is where the interesting decisions happen. The degradation model is simplified — real food science involves dozens of variables including pH, moisture content, surface area, and whether you're cooking with oil — but it's accurate enough to rank cooking methods meaningfully.
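To make the formula concrete, here's the poaching case worked by hand, using the king salmon coefficients from the profile above:

```javascript
// Worked example: omega-3 retention for poached king salmon,
// using the same coefficients and formula as estimateNutrientRetention.
var lossPerMin = 0.018;     // omega3_loss_per_minute_at_350f for king salmon
var tempFactor = 180 / 350; // poaching temperature relative to the 350F baseline
var minutes = 12;           // average poaching time

var omega3Retained = Math.max(1 - lossPerMin * tempFactor * minutes, 0.15);
console.log((omega3Retained * 100).toFixed(1) + "% omega-3 retained");
// prints "88.9% omega-3 retained"
```

That ~89% figure for poaching is what drives the rankings discussed later in the post.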


Feeding the Data to an LLM

The raw retention numbers are useful but not exactly inspiring. Nobody wants to eat dinner based on a spreadsheet. So I added an AI layer that takes the nutritional analysis and generates actual recipes optimized for the user's goals.

app.post("/api/optimize-recipe", function (req, res) {
  var fish = req.body.fish;
  var goal = req.body.goal || "omega3";
  var servings = req.body.servings || 2;
  var dietaryRestrictions = req.body.restrictions || [];

  // Reject unknown fish up front; otherwise estimateNutrientRetention
  // returns null for every method and the sort below throws.
  if (!nutritionProfiles[fish]) {
    return res.status(400).json({ error: "Unknown fish: " + fish });
  }

  var methodResults = Object.keys(cookingMethods).map(function (method) {
    return estimateNutrientRetention(fish, method);
  });

  methodResults.sort(function (a, b) {
    if (goal === "omega3") return b.omega3_mg - a.omega3_mg;
    if (goal === "vitamin_d") return b.vitamin_d_iu - a.vitamin_d_iu;
    return b.retention_score - a.retention_score;
  });

  var topMethods = methodResults.slice(0, 3);

  var prompt =
    "You are a nutrition-focused chef specializing in Alaskan seafood. " +
    "Create a recipe for " + fish.replace("_", " ") +
    " for " + servings + " servings. " +
    "Use one of these cooking methods ranked by " + goal +
    " retention: " + JSON.stringify(topMethods) + ". " +
    "Dietary restrictions: " +
    (dietaryRestrictions.length > 0
      ? dietaryRestrictions.join(", ")
      : "none") +
    ". " +
    "Include estimated nutrition per serving based on the retention data. " +
    "Use ingredients commonly available in rural Alaska. " +
    "Format as a practical recipe with clear steps.";

  openai.chat.completions
    .create({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: prompt }],
      max_tokens: 1500,
    })
    .then(function (response) {
      res.json({
        recipe: response.choices[0].message.content,
        nutritionAnalysis: topMethods,
        selectedMethod: topMethods[0].method,
        goal: goal,
      });
    })
    .catch(function (err) {
      console.error("Recipe generation failed:", err.message);
      res.status(500).json({ error: "Recipe generation failed" });
    });
});

The key insight here is that the AI isn't doing the nutritional analysis. The deterministic model handles that. The AI is doing the creative work — turning "poach king salmon, retain 87% of its omega-3s" into "Poached King Salmon with Dill and Lemon over Wild Rice." It's the right division of labor. You don't want an LLM doing math. You want it doing language.


What the Data Actually Showed Me

Running every fish through every cooking method produced some results that genuinely changed how I cook.

Smoking salmon destroys more omega-3s than I expected. The low temperature is deceptive. At 225 degrees for two hours, you're looking at 40-50% omega-3 degradation. The extended time at moderate heat adds up. Smoked salmon is delicious. It is not the nutritional powerhouse people think it is.

Poaching is the undisputed champion for nutrient retention. Low temperature, short time, no added fat. Poached salmon retains roughly 85-90% of its omega-3 content. The downside is that poached fish is, to put it gently, not the most exciting meal. This is where the AI recipe optimization actually shines — it generates flavor-forward poaching liquids (court-bouillon with birch syrup and juniper berries, for example) that make the method appealing.

Grilling is worse than pan-searing. This surprised me. The extreme heat of grilling (450+ degrees) causes rapid omega-3 oxidation even though the cooking time is short. A quick pan sear at 400 degrees for six minutes retains more than grilling at 450 for eight minutes. The math doesn't lie.

Raw preparations win everything. Cured salmon (gravlax-style) retains essentially 100% of its nutrients. The citric acid in a ceviche-style preparation doesn't degrade omega-3s the way heat does. If you can get past the texture preferences, raw or cured preparations are nutritionally optimal.


Scaling to a Full Meal Planner

Once the single-recipe optimizer worked, I couldn't resist building it out into a weekly meal planner. The idea was to take my actual freezer inventory and plan a week of meals that hit specific nutritional targets.

function generateWeeklyPlan(freezerInventory, weeklyTargets) {
  var plan = [];
  var remainingTargets = {
    omega3_mg: weeklyTargets.omega3_mg || 11200,
    vitamin_d_iu: weeklyTargets.vitamin_d_iu || 4200,
    protein_g: weeklyTargets.protein_g || 350,
  };

  var days = [
    "Monday", "Tuesday", "Wednesday",
    "Thursday", "Friday", "Saturday", "Sunday",
  ];

  days.forEach(function (day) {
    var bestCombo = null;
    var bestScore = -1;

    freezerInventory.forEach(function (fish) {
      if (fish.portions <= 0) return;

      Object.keys(cookingMethods).forEach(function (method) {
        var retention = estimateNutrientRetention(fish.type, method);
        if (!retention) return;

        var score = 0;
        // Clamp denominators so a met (or exceeded) weekly target can't divide by zero.
        score += (retention.omega3_mg / Math.max(remainingTargets.omega3_mg, 1)) * 40;
        score += (retention.vitamin_d_iu / Math.max(remainingTargets.vitamin_d_iu, 1)) * 30;
        score += (retention.protein_g / Math.max(remainingTargets.protein_g, 1)) * 30;

        if (plan.length > 0) {
          var lastMethod = plan[plan.length - 1].method;
          if (method !== lastMethod) score += 10;
          var lastFish = plan[plan.length - 1].fish;
          if (fish.type !== lastFish) score += 15;
        }

        if (score > bestScore) {
          bestScore = score;
          bestCombo = { day: day, fish: fish.type, method: method, retention: retention };
        }
      });
    });

    if (bestCombo) {
      plan.push(bestCombo);
      remainingTargets.omega3_mg -= bestCombo.retention.omega3_mg;
      remainingTargets.vitamin_d_iu -= bestCombo.retention.vitamin_d_iu;
      remainingTargets.protein_g -= bestCombo.retention.protein_g;

      var fishItem = freezerInventory.find(function (f) {
        return f.type === bestCombo.fish;
      });
      if (fishItem) fishItem.portions -= 1;
    }
  });

  return plan;
}

The scoring function is deliberately simple. It weights omega-3 retention highest (40%), then vitamin D (30%), then protein (30%). It also adds a variety bonus so you're not eating poached salmon seven days straight, which is what a pure optimization would suggest.


The AI Recipe Quality Problem

Here's where I have to be honest about what worked and what didn't. The AI-generated recipes range from genuinely excellent to mildly delusional.

The good ones are really good. When you give the model specific constraints — "poached halibut, maximize selenium retention, ingredients available in rural Alaska, serves two" — it produces focused, creative recipes. I got a poached halibut recipe with a birch syrup glaze and roasted root vegetables that was legitimately one of the best things I've cooked this year.

The bad ones reveal the model's lack of Alaskan context. One recipe called for "fresh lemongrass from your local Asian market." I live in Caswell Lakes. The nearest Asian market is a four-hour drive to Anchorage, and that's if the roads are clear. Another recipe suggested "quickly blanching fresh asparagus" in January. In Alaska. Where January means negative twenty and the only fresh vegetables come from a can or the once-a-week delivery truck if it makes it through.

I solved this by adding a hard ingredient whitelist to the prompt:

var alaskanPantryItems = [
  "wild rice", "birch syrup", "spruce tips", "fireweed honey",
  "dried cranberries", "sourdough bread", "potatoes",
  "onions", "garlic", "carrots", "cabbage",
  "canned tomatoes", "dried herbs", "butter",
  "olive oil", "vinegar", "soy sauce",
  "black pepper", "salt", "lemon juice",
  "frozen berries", "oats", "flour",
];

Constraining the ingredient list dramatically improved recipe quality. The model stopped suggesting ingredients I couldn't get and started getting creative with what I actually have. Necessity is the mother of invention, even for language models.
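For reference, this is roughly how the whitelist gets folded into the prompt — the exact clause wording below is illustrative, not the production prompt string:

```javascript
// Sketch: turning the pantry whitelist into a hard constraint clause.
// The wording is an assumption; the array is abbreviated from the full list above.
var alaskanPantryItems = ["wild rice", "birch syrup", "spruce tips" /* ...full list above... */];

var pantryClause =
  "Use ONLY the fish itself plus ingredients from this list: " +
  alaskanPantryItems.join(", ") + ". " +
  "If a classic pairing is not on the list, substitute from the list instead.";

// pantryClause is then appended to the prompt string before the API call.
console.log(pantryClause);
```

The explicit "substitute from the list" instruction matters: without it, the model tends to apologize for missing ingredients instead of working around them.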


Nutritional Validation Against Published Research

I didn't want to trust my degradation model blindly, so I cross-referenced my estimates against published food science papers. The results were encouraging but humbling.

My omega-3 degradation estimates were within 8-12% of published values for most cooking methods. That's acceptable for meal planning purposes. Where I was off was in smoking — the published literature shows even higher degradation than my model predicted because smoke compounds interact with fatty acids in ways my simple time-temperature model doesn't capture.

The vitamin D retention numbers were harder to validate because the published research varies wildly depending on the specific study methodology. I ended up averaging across three meta-analyses and adjusting my coefficients accordingly.

Here's the important caveat: this is a planning tool, not a clinical instrument. If someone has specific medical nutritional requirements, they should be working with a dietitian, not a Node.js application I built in my cabin. But for the purpose of "should I grill this salmon or poach it if I care about omega-3s," the model is more than adequate.


What I Actually Eat Now

The tool changed my cooking habits in concrete ways.

I poach fish more often. I never would have poached a piece of salmon before this project. Now it's my go-to method for weeknight meals. The AI-generated court-bouillon recipes (white wine, bay leaf, peppercorns, a splash of birch syrup) produce genuinely delicious results with minimal effort and maximum nutritional retention.

I make more gravlax. The raw-cure method retains everything, and it requires zero active cooking time. Salt, sugar, dill, a couple days in the fridge, done. It's the lazy engineer's optimal preparation.

I grill less frequently. This was the hardest behavioral change. Grilled salmon over alder wood is practically a religious experience up here. But knowing the omega-3 numbers makes me save it for weekends and special occasions rather than defaulting to it three times a week.

I smoke strategically. I still smoke salmon — it's too good not to — but I think of it as a flavor-focused preparation rather than a health-focused one. When I want nutrition, I poach. When I want to make the cabin smell incredible, I smoke.


Lessons for Other AI-Augmented Lifestyle Projects

If you're thinking about building something similar — not necessarily for seafood, but any domain where you want AI to optimize personal decisions — here's what I learned:

Keep the deterministic model separate from the AI layer. The nutrition calculations don't need a language model. They need math. Use AI for the creative, language-heavy parts and keep the analytical core in plain code you can test and validate.

Constrain the AI's context aggressively. The more specific your constraints, the better the output. "Generate a recipe" produces generic slop. "Generate a recipe for poached king salmon using only these 20 ingredients, optimized for omega-3 retention, serving two" produces something actually useful.

Validate against ground truth. If your model makes claims about the physical world — nutritional content, weather patterns, material properties — check those claims against published data. AI is a terrible physicist but a decent creative partner. Don't confuse the two.

Build for your actual life. The most useful tools I've built are ones that solve problems I genuinely have. I live in Alaska. I catch fish. I want to eat well. That specificity made this project useful in a way that a generic meal planning app never could be.


The Bigger Picture

This project started as a weekend experiment and turned into something I use every week. It's not a commercial product. It's not trying to be. It's a tool that makes my life slightly better by applying software engineering to a domain I care about.

That's the part about AI that excites me most — not the grand enterprise transformation narratives, but the small, personal applications. A fisherman in Alaska who wants to eat better. A gardener who wants to optimize planting schedules. A woodworker who wants to predict material costs. These are problems that don't justify hiring a data scientist but absolutely justify a weekend with Node.js and an API key.

The fish doesn't care whether you're using cutting-edge technology or a cast iron pan. But if the technology helps you treat it right — cook it in a way that respects both the flavor and the nutrition — then I'd say that's time well spent.

Now if you'll excuse me, I've got a piece of sockeye that my model says should be poached at 175 degrees for exactly eleven minutes. The algorithm has spoken.


Shane Larson is a software engineer, author, and the founder of Grizzly Peak Software. He writes code from a cabin in Caswell Lakes, Alaska, where the salmon are wild and the debugging sessions are long.
